
GPU memory buffer pre-fetch and pre-back signaling to avoid page-fault

A buffer and memory-management technology, applied to instruments, data conversion, and image memory management, that addresses GPU processing inefficiency and the lack of techniques for stopping and resuming highly parallel operations.

Inactive Publication Date: 2015-05-27
QUALCOMM INC
5 Cites · 4 Cited by

AI Technical Summary

Problems solved by technology

GPU processing inefficiencies often occur during memory accesses because existing systems lack techniques for stopping and resuming the highly parallel jobs executing on the GPU.




Embodiment Construction

[0017] The present invention relates to techniques for graphics processing, and more particularly to techniques for pre-fetch and pre-back signaling from a graphics processing unit (GPU) to avoid page faults in a virtual memory system.

[0018] A modern operating system (OS) running on a central processing unit (CPU) typically uses a virtual memory scheme to allocate memory to the multiple programs operating on the CPU. Virtual memory is a memory management technique that virtualizes a computer system's physical memory (e.g., RAM, disk storage, etc.) such that applications need reference only a single address space (i.e., virtual memory). Virtual memory consists of contiguous address spaces that map to locations in physical memory. In this way, segments of physical memory are "hidden" from application programs, which instead interact with contiguous blocks of virtual memory. Contiguous blocks of virtual memory are usually arranged into "pages." Each page is some fixed-length ...
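The mapping described in [0018] can be sketched in a few lines. This is a minimal illustration, not anything from the patent: the 4 KiB page size, the `page_table` contents, and the addresses below are all assumed values, and a real MMU performs this translation in hardware with multi-level tables.

```python
# Minimal sketch of the virtual-to-physical mapping described above.
# Page size, page-table contents, and addresses are illustrative
# assumptions, not values from the patent.

PAGE_SIZE = 4096  # bytes; a common fixed page length

# Hypothetical page table: virtual page number -> physical frame number.
# Pages absent from the table are unbacked and would trigger a page fault.
page_table = {0: 7, 1: 3, 2: 9}

def translate(virtual_addr):
    """Split a virtual address into (page, offset) and map it to physical."""
    page = virtual_addr // PAGE_SIZE
    offset = virtual_addr % PAGE_SIZE
    if page not in page_table:
        raise RuntimeError(f"page fault: virtual page {page} is not backed")
    return page_table[page] * PAGE_SIZE + offset

# A contiguous virtual range maps to scattered physical frames,
# which is how physical memory stays "hidden" from the application.
print(translate(0))     # virtual page 0 -> frame 7
print(translate(4100))  # virtual page 1, offset 4 -> frame 3
```

Note how two adjacent virtual pages (0 and 1) land in non-adjacent physical frames (7 and 3); the application sees only the contiguous virtual side.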



Abstract

This disclosure proposes techniques for demand paging for an I/O device (e.g., a GPU) that use pre-fetch and pre-back notification event signaling to reduce the latency associated with demand paging. Page faults are limited by performing the demand-paging operations before the I/O device actually requests unbacked memory.
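The idea in the abstract can be illustrated with a toy simulation: the device announces which pages it will need before touching them (pre-fetch) and which pages it is finished with (pre-back), so first accesses no longer fault. The function names, the event mechanism, and the workload here are all illustrative assumptions, not the patent's actual signaling protocol.

```python
# Toy simulation of pre-fetch / pre-back notification events.
# Names and mechanism are assumptions for illustration only.

backed_pages = set()
page_faults = 0

def access(page):
    """Device touches a page; an unbacked page faults and is backed on demand."""
    global page_faults
    if page not in backed_pages:
        page_faults += 1          # costly: the device stalls while the OS backs it
        backed_pages.add(page)

def prefetch_notify(pages):
    """Pre-fetch event: back pages before the device actually asks for them."""
    backed_pages.update(pages)

def preback_notify(pages):
    """Pre-back event: pages the device is done with can be reclaimed."""
    backed_pages.difference_update(pages)

workload = [0, 1, 2, 3]

# Without signaling, every first access to an unbacked page faults.
for p in workload:
    access(p)
print(page_faults)  # 4

# Release the pages, then signal ahead of time: the same accesses
# fault zero times because the pages are backed before the request.
preback_notify(workload)
faults_before = page_faults
prefetch_notify(workload)
for p in workload:
    access(p)
print(page_faults - faults_before)  # 0
```

The point of the pre-back event in this sketch is that the host can reclaim physical memory early instead of waiting for the buffer to be freed.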

Description

Technical field

[0001] The present invention relates to techniques for graphics processing, and more particularly to techniques for pre-fetch and pre-back signaling from a graphics processing unit (GPU) to avoid page faults in a virtual memory system.

Background technique

[0002] Visual content for display (e.g., graphical user interfaces and content for video games) may be generated by a graphics processing unit (GPU). A GPU can convert two-dimensional (2D) or three-dimensional (3D) objects into a displayable 2D pixel representation. Additionally, GPUs are increasingly used to perform certain types of computations that are handled efficiently by the highly parallel nature of GPU cores; such applications are sometimes referred to as general-purpose GPU (GPGPU) applications. Converting information about 3D objects into displayable bitmaps, and large GPGPU applications, require considerable memory and processing power. GPU processing inefficiencies can ...
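The background's point about highly parallel work can be made concrete with a small model: when thousands of work items execute in lockstep batches, a batch finishes only when its slowest item does, so a single page fault stalls the entire batch. The cycle costs and batch model below are made-up illustrative assumptions, not measurements or anything from the patent.

```python
# Rough model of why one page fault hurts a highly parallel device:
# a batch of work items completes at the speed of its slowest member.
# FAULT_COST and HIT_COST are invented units for illustration.

FAULT_COST = 1000   # cycles to service a page fault (assumed)
HIT_COST = 1        # cycles for an access to a backed page (assumed)

def batch_cost(accesses, backed):
    """Cost of a lockstep batch: the maximum cost over its work items."""
    return max(FAULT_COST if page not in backed else HIT_COST
               for page in accesses)

backed = {0, 1, 2}
print(batch_cost([0, 1, 2], backed))  # all pages backed: cheap
print(batch_cost([0, 1, 9], backed))  # one unbacked page stalls the whole batch
```

Under this model, avoiding the fault entirely (by backing page 9 ahead of time) is worth far more than speeding up the fault handler.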

Claims


Application Information

IPC(8): G06T1/60
CPC: G06T1/60; G06F5/14
Inventor: Colin Christopher Sharp, David Rigel Garcia Garcia, Eduardus A. Metz
Owner QUALCOMM INC