
High-performance instruction cache system and method

A high-performance instruction cache technology in the computer field that can solve problems such as cache misses, achieving the effects of avoiding capacity misses, improving running speed, and reducing power consumption.

Active Publication Date: 2015-03-18
SHANGHAI XINHAO MICROELECTRONICS

AI Technical Summary

Problems solved by technology

However, with the widening processor/memory speed gap, the current cache architecture, and in particular the various kinds of cache misses, has become the most serious bottleneck limiting the performance improvement of modern processors.




Detailed Description of the Embodiments

[0039] The high-performance cache system and method proposed by the present invention are further described in detail below in conjunction with the accompanying drawings and specific embodiments. The advantages and features of the present invention will become apparent from the following description and claims. It should be noted that the drawings are all in a greatly simplified form and use imprecise scales, and serve only to conveniently and clearly illustrate the embodiments of the present invention.

[0040] It should be noted that, in order to clearly illustrate the content of the present invention, multiple embodiments are specifically cited to further explain different implementations of the present invention, where these embodiments are enumerative rather than exhaustive. In addition, for brevity of description, content already mentioned in an earlier embodiment is often omitted in later embodiments; ther...



Abstract

A high-performance instruction cache system and method, applied in the processor field, can fill an instruction into a high-speed memory that the processor core can access directly before the processor core executes that instruction, so that the processor core obtains the needed instruction from the high-speed memory on almost every fetch, thereby achieving a very high cache hit rate.
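
As a rough illustration of the idea summarized above, and only as an illustration (not the patent's actual mechanism), the C sketch below models filling a block of instructions into a small fast memory ahead of execution so that the core's fetches are served from that memory; the names, the block size and the simple prefill policy are all assumptions.

/*
 * Illustrative sketch only: fill instructions into a small fast memory
 * before the core needs them, so that fetches are served from it.
 * FAST_MEM_WORDS, the word size and the prefill policy are assumptions.
 */
#include <stdint.h>
#include <string.h>

#define FAST_MEM_WORDS 256u                 /* assumed capacity in 32-bit words */

typedef struct {
    uint32_t base_pc;                       /* address of the first word held   */
    uint32_t words[FAST_MEM_WORDS];         /* instructions the core can read   */
} fast_mem_t;

/* Fill the fast memory from lower-level storage ahead of execution.
 * 'lower_mem' is assumed to be word-addressable instruction storage. */
void prefill(fast_mem_t *fm, uint32_t pc, const uint32_t *lower_mem)
{
    fm->base_pc = pc & ~((FAST_MEM_WORDS * 4u) - 1u);  /* align to the block   */
    memcpy(fm->words, &lower_mem[fm->base_pc / 4u], sizeof fm->words);
}

/* Fetch: because the block was filled in advance, this path is a hit
 * whenever 'pc' falls inside the prefilled block. */
uint32_t fetch(const fast_mem_t *fm, uint32_t pc)
{
    return fm->words[(pc - fm->base_pc) / 4u];
}

In a real design the fill would be driven by the processor's control logic rather than an explicit copy, but the sketch is enough to show why a prefilled fast memory lets the core hit on almost every fetch.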

Description

Technical field

[0001] The invention relates to the fields of computers, communications and integrated circuits.

Background technique

[0002] Usually, the function of a cache is to copy part of the content of a lower-level memory so that this content can be quickly accessed by a higher-level memory or by the processor core, thereby keeping the pipeline running continuously.

[0003] Current caches are addressed in the following way: the index segment of the address is used to address and read a tag from the tag memory, which is matched against the tag segment of the address; the index segment of the address together with the in-block offset segment is used to address and read the content of the cache. If the tag read from the tag memory is the same as the tag segment of the address, the content read from the cache is valid, which is called a cache hit. Otherwise, if the tag read from the tag memory differs from the tag segment of the address, it is called a cache miss...
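
For reference, the conventional lookup described in paragraph [0003] can be sketched in C as follows. The split of the address into tag, index and in-block offset segments follows the description above; the direct-mapped organization, line size and line count are assumptions chosen only to keep the sketch short.

/*
 * Sketch of a conventional cache lookup: the address is split into
 * tag / index / in-block offset, the index selects an entry in the tag
 * memory, and a tag match means the data read from the cache content
 * memory is valid (a hit); a mismatch is a miss. Sizes are assumptions.
 */
#include <stdbool.h>
#include <stdint.h>

#define LINE_BYTES 64u                        /* assumed cache line size   */
#define NUM_LINES  128u                       /* assumed number of lines   */

typedef struct {
    bool     valid[NUM_LINES];
    uint32_t tag[NUM_LINES];                  /* tag memory                */
    uint8_t  data[NUM_LINES][LINE_BYTES];     /* cache content memory      */
} dm_cache_t;

/* Returns true on a hit and copies the byte at 'addr' into *out. */
bool cache_lookup(const dm_cache_t *c, uint32_t addr, uint8_t *out)
{
    uint32_t offset = addr % LINE_BYTES;               /* in-block offset   */
    uint32_t index  = (addr / LINE_BYTES) % NUM_LINES; /* index segment     */
    uint32_t tag    = addr / (LINE_BYTES * NUM_LINES); /* tag segment       */

    if (c->valid[index] && c->tag[index] == tag) {     /* tag match: hit    */
        *out = c->data[index][offset];
        return true;
    }
    return false;                                      /* mismatch: miss    */
}

On a miss, the block must first be fetched from lower-level memory before execution can continue, which is the kind of delay the background section identifies as the main bottleneck.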


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F12/08; G06F12/0864; G06F12/0875; G06F12/0897
CPC: G06F12/08; G06F12/0864; G06F12/0875; G06F12/0897; G06F2212/452
Inventor: 林正浩
Owner: SHANGHAI XINHAO MICROELECTRONICS