
Method and apparatus for memory access scheduling to reduce memory access latency

A memory access scheduling and latency-reduction technology, applied in memory systems, instruments, memory address/allocation/relocation, etc., aimed at solving problems such as degradation of memory system efficiency and performance.

Status: Inactive; Publication Date: 2010-02-03
INTEL CORP

AI Technical Summary

Problems solved by technology

If these free DRAM command bus slots are not otherwise used, memory system efficiency and performance degrade.



Examples


Embodiment Construction

[0034] The present invention generally relates to methods and apparatus for memory controller scheduling of memory accesses to reduce latency. Referring to the drawings, exemplary embodiments of the present invention will now be described. The exemplary embodiments are provided to illustrate the invention and should not be construed as limiting the scope of the invention.

[0035] Figure 2 shows an embodiment of the invention comprising system 200. System 200 includes central processing unit (CPU) and cache memory 210, memory controller 220, input/output (I/O) 230, main memory 240, and memory bus 250. Note that the CPU and cache memory 210 and the memory controller 220 may be on the same chip. The main memory 240 may be Dynamic Random Access Memory (DRAM), Synchronous DRAM (SDRAM), SDRAM/Double Data Rate (SDRAM/DDR), Rambus DRAM (RDRAM), or the like. In one embodiment of the invention, memory controller 220 includes an intelligent automatic precharge process 215 ...
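
As a rough illustration only, the component relationships described in paragraph [0035] could be sketched as plain C data structures; every type and field name below is an assumption chosen for readability, not taken from the patent text.

    /* Hypothetical C sketch of the system 200 layout from paragraph [0035].
     * All names and fields are illustrative assumptions. */
    #include <stdbool.h>

    struct memory_controller_220 {
        bool intelligent_auto_precharge;   /* process 215: decide per access whether to auto-precharge */
    };

    struct system_200 {
        void *cpu_and_cache_210;               /* CPU and cache memory; may share a chip with 220 */
        struct memory_controller_220 mc_220;   /* memory controller */
        void *io_230;                          /* input/output */
        void *main_memory_240;                 /* DRAM, SDRAM, SDRAM/DDR, RDRAM, or the like */
        void *memory_bus_250;                  /* connects the controller to main memory */
    };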



Abstract

A device is presented including a memory controller. The memory controller is connected to a read request queue. A command queue is coupled to the memory controller. A memory page table is connected to the memory controller. The memory page table has many page table entries. A memory page history table is connected to the memory controller. The memory page history table has many page history table entries. A pre-calculated lookup table is connected to the memory controller. The memory controller includes a memory scheduling process to reduce memory access latency.
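
To make the listed structures concrete, here is a minimal, hedged C sketch of a controller holding the queues and tables named in the abstract; the sizes, field layouts, and names are assumptions made only for illustration, not the patent's actual design.

    /* Minimal sketch of the controller state named in the abstract.
     * NUM_BANKS, QUEUE_DEPTH, and all field layouts are assumptions. */
    #include <stdint.h>
    #include <stdbool.h>

    #define NUM_BANKS   4    /* a small number of banks per device, per the background text */
    #define QUEUE_DEPTH 16   /* assumed queue depth */

    struct page_table_entry   { bool page_open; uint32_t open_page; };  /* one per bank */
    struct page_history_entry { uint32_t recent_pages[4]; };            /* per-bank access history, assumed */

    struct memory_controller {
        uint64_t read_request_queue[QUEUE_DEPTH];         /* pending read requests */
        uint64_t command_queue[QUEUE_DEPTH];              /* DRAM commands awaiting free command-bus slots */
        struct page_table_entry   page_table[NUM_BANKS];  /* which page, if any, is open in each bank */
        struct page_history_entry page_history[NUM_BANKS];
        uint8_t precalculated_lookup[256];                /* pre-calculated lookup table; indexing assumed */
    };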

Description

Technical Field

[0001] The present invention relates to memory controllers, and more particularly, to methods and apparatus for reducing memory access latency and increasing bandwidth.

Background Art

[0002] Modern memories, such as dynamic random access memory (DRAM), are used throughout the computer industry. Memory is organized into pages. Each memory page contains data in a number of contiguous memory cells. Memory devices such as DRAM are further organized into a small number of banks (e.g., 4 banks) per DRAM device. Each device has many pages per memory bank. Only a single page can be accessed from a memory bank at a time. Before a page in a memory bank can be accessed, the page must be opened using the "activate" command. This activate command is also called a "row" command. A memory request that requires a page to be opened is called a page-empty access request. A memory request for a page that has already been opened is called a page-hit...
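
Assuming the standard page-empty / page-hit / page-miss terminology that the background begins to define (the excerpt is cut off), a small C helper shows how a controller might classify an incoming request against a bank's page-table state; the page-miss branch is an assumption completing the truncated text, and all names are illustrative.

    /* Illustrative classification of a request against one bank's page state.
     * The struct is repeated here so the example stands alone. */
    #include <stdint.h>
    #include <stdbool.h>

    struct page_table_entry { bool page_open; uint32_t open_page; };

    enum page_state { PAGE_EMPTY, PAGE_HIT, PAGE_MISS };

    enum page_state classify_request(const struct page_table_entry *bank, uint32_t page)
    {
        if (!bank->page_open)
            return PAGE_EMPTY;   /* bank idle: needs an "activate" (row) command first */
        if (bank->open_page == page)
            return PAGE_HIT;     /* target page already open: column access only */
        return PAGE_MISS;        /* different page open: precharge, then activate (assumed case) */
    }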


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06F12/10; G06F13/16; G06F12/02
CPC: G06F12/0215; G06F13/1631; G06F13/161; G06F13/00
Inventor: 兰迪·奥斯本, 纳吉·阿布伦宁, 瓦姆西·马达瓦拉普, 罗摩克里希纳·哈加德黑里, 迈克尔·克罗克
Owner: INTEL CORP