
Computer micro system structure comprising explicit high-speed buffer storage

A cache and architecture technology in the field of computer systems. It addresses the problems that the cache is invisible to the compiler, that program locality can be poor, and that memory accesses miss the cache; it achieves fast accesses that the hardware can easily identify and implement.

Active Publication Date: 2004-09-15
INST OF COMPUTING TECH CHINESE ACAD OF SCI


Problems solved by technology

Memory accesses that miss the Cache are the main cause of delay.
[0012] 3) The Cache structure is invisible to the compiler: the compiler knows the Cache exists, but it cannot purposefully place data with good reusability into the Cache.
Taking the program in Figure 8 as an example, when m is large and the m elements of b, b(1), b(2), ..., b(m), cannot all fit in the Cache, the locality of the program is extremely poor.
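The patent's Figure 8 is not reproduced here, but the locality failure it describes can be sketched as follows. This is a hypothetical reconstruction, not the patent's actual program: an outer loop that re-reads all m elements of b on every iteration, so that once m * sizeof(double) exceeds the Cache capacity, b is evicted before any element can be reused and every pass misses on every line.

```c
#include <stddef.h>

/* Sum of b[0..m-1]; each call streams the whole array once. */
double sum_b(const double *b, size_t m) {
    double s = 0.0;
    for (size_t j = 0; j < m; j++)
        s += b[j];
    return s;
}

/* Hypothetical reuse pattern in the spirit of the patent's Figure 8:
   every outer iteration re-reads all m elements of b. When m is large
   enough that b does not fit in the Cache, there is no temporal reuse
   between iterations of i, so the program's locality is extremely poor. */
void kernel(double *a, size_t n, const double *b, size_t m) {
    for (size_t i = 0; i < n; i++)
        a[i] = sum_b(b, m);
}
```

A conventional Cache cannot be told to pin b; the patent's Ecache, managed explicitly by the compiler, is meant to make exactly this kind of placement decision possible.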




Embodiment Construction

[0031] The present invention is described below in conjunction with the accompanying drawings. As shown in Fig. 2, the present invention adds an Ecache to the CPU chip; the Ecache is essentially a high-speed on-chip memory. The Ecache is connected in the same way as the cache in an existing computer system: one end is connected to the registers, and the other end is connected to the memory. Moreover, data transfers between the Ecache and the memory and registers share the existing cache mechanisms.

[0032] FIG. 4 is a schematic diagram of the unified addressing and address-space division of the Ecache and memory. As Figure 4 shows, the Ecache inside the CPU chip and the memory outside the chip are addressed as one unified space, starting from the low addresses. When an Ecache access is implemented, the hardware first checks whether the access address is < m. If it is < m, the access goes directly to the Ecache, and the compiler ensures that the data to...
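The address test described in [0032] amounts to a single compare against the boundary m. A minimal sketch, assuming a hypothetical boundary constant (the patent does not fix a value for m):

```c
#include <stdint.h>

/* Hypothetical boundary m: here the first 64 KB of the unified address
   space maps to the on-chip Ecache; everything at or above it goes
   through the ordinary memory/cache path. */
#define ECACHE_BOUNDARY 0x10000u

typedef enum { TARGET_ECACHE, TARGET_MEMORY } target_t;

/* The routing decision of [0032]: addresses below m are Ecache accesses,
   visible in every memory-access instruction and trivially recognized
   by the hardware with one comparison. */
target_t route_access(uint32_t addr) {
    return (addr < ECACHE_BOUNDARY) ? TARGET_ECACHE : TARGET_MEMORY;
}
```

Because the split is a fixed address compare, no tag lookup is needed to decide where an access is served, which is what makes Ecache accesses "easy to identify and implement" in hardware.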



Abstract

The architecture comprises memory, cache, registers, and an arithmetic unit, and additionally an Ecache inside the CPU chip. The Ecache and the memory are addressed in one unified space. Because the Ecache is inside the CPU, the hardware is guaranteed fast access to it. The unified address space starts from the low addresses, so in every memory-access instruction an address that targets the Ecache is visible and easy for the hardware to recognize and implement. Several groups of instructions are designed to let the compiler and the runtime use the Ecache explicitly and manage it dynamically; these instructions and the Ecache form an inseparable whole.

Description

Technical field
[0001] The present invention relates to computer systems, and in particular to computer microarchitectures containing an explicit cache memory (abbreviated Ecache).
Technical background
[0002] Over the past 50 years, computer performance has generally increased according to Moore's law, mainly by raising the machine's operating frequency and employing various parallel mechanisms. Although storage technology has also advanced, a large gap remains between memory speed and processor speed. Modern computers place a level-1, level-2, or even level-3 cache memory (Cache) between memory and registers, in the hope that data in the cache can be reused to alleviate the problem of slow memory access (see Figure 1).
[0003] Why does adding a Cache to the chip alleviate the memory-access bottleneck? Let us take the process of the CPU reading data as an example to briefly explain how the Cache works (see Figure 3 for the workin...
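The read process that [0003] begins to describe can be modeled with a minimal direct-mapped cache, a sketch of a conventional Cache for contrast with the Ecache, not the patent's hardware. The sizes and the fill-on-miss behavior are assumptions for illustration:

```c
#include <stdbool.h>
#include <stdint.h>

#define NUM_LINES  64   /* assumed: number of cache lines */
#define LINE_BYTES 32   /* assumed: bytes per line */

typedef struct {
    bool     valid;
    uint32_t tag;
} cache_line_t;

static cache_line_t lines[NUM_LINES];

/* Conventional read lookup: the address is split into index and tag;
   a valid line with a matching tag is a hit (data comes from the Cache).
   On a miss the line is notionally filled from memory, so a repeated
   access to the same address hits. Which data stays resident is decided
   by this hardware policy alone; the compiler cannot influence it, which
   is the problem the Ecache is designed to solve. */
bool cache_read(uint32_t addr) {
    uint32_t index = (addr / LINE_BYTES) % NUM_LINES;
    uint32_t tag   = addr / (LINE_BYTES * NUM_LINES);
    if (lines[index].valid && lines[index].tag == tag)
        return true;            /* hit */
    lines[index].valid = true;  /* miss: fetch line, evicting the old one */
    lines[index].tag   = tag;
    return false;
}
```

Note that two addresses mapping to the same index evict each other regardless of which one the program will reuse; explicit placement in the Ecache avoids such conflict misses for the data the compiler chooses to keep on chip.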


Application Information

IPC(8): G06F9/30, G06F12/02
Inventors: 张兆庆 (Zhang Zhaoqing), 乔如良 (Qiao Ruliang), 唐志敏 (Tang Zhimin), 冯晓兵 (Feng Xiaobing)
Owner INST OF COMPUTING TECH CHINESE ACAD OF SCI