Cache data prefetching method based on incremental closed sequence mining

A cache data prefetching technology, applied in the fields of electrical digital data processing, special data processing applications, and memory systems. It addresses the problem that existing methods cannot mine access sequences that are updated dynamically with small changes at high frequency, and achieves an improved cache hit rate in such environments.

Inactive Publication Date: 2008-09-17
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

The CMiner mining method is designed for static databases and cannot perform mining in a real-time environment, where the access sequence of stored data is updated dynamically and is characterized by small changes and high frequency. A more efficient real-time closed sequence mining method is therefore needed to meet these requirements.




Embodiment Construction

[0039] The processing flow of the present invention is shown in Figure 2 and is applied in the data prefetching step of the cache data prefetch module in Figure 1. First, the data access sequence that the CPU issues to the memory is collected and converted into sequences that are entered into a sequence database. The incremental closed sequence mining algorithm is then used to mine frequent closed sequences and extract cache data prefetch rules, which finally guide the prefetching of cache data and improve the cache hit ratio. The processing flow of the present invention is described below with reference to Figure 2:
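To make this flow concrete, the following Python sketch walks the same steps on the example trace: build a small sequence database, mine frequent closed sequences, and turn them into prefetch rules. It is only an illustration under stated assumptions: the function names, the contiguous n-gram miner, and the split of the trace into four sequences are hypothetical, and the patent's incremental closed sequence mining algorithm is more general (it updates its result as the sequence database changes rather than re-mining from scratch).

```python
from collections import Counter

# Illustrative sketch only: a simplified, non-incremental closed sequence
# miner over contiguous subsequences, used to show how prefetch rules can
# be derived. The patent's incremental algorithm is more general.

def mine_closed_sequences(sequences, min_support=2, max_len=4):
    """Return frequent patterns that are closed: no longer frequent
    pattern has the same support."""
    support = Counter()
    for seq in sequences:
        seen = set()
        for n in range(1, max_len + 1):
            for i in range(len(seq) - n + 1):
                seen.add(tuple(seq[i:i + n]))
        for pat in seen:                 # count each pattern once per sequence
            support[pat] += 1
    frequent = {p: s for p, s in support.items() if s >= min_support}

    def contains(big, small):
        return any(big[i:i + len(small)] == small
                   for i in range(len(big) - len(small) + 1))

    return {p: s for p, s in frequent.items()
            if not any(len(q) > len(p) and sq == s and contains(q, p)
                       for q, sq in frequent.items())}

def extract_rules(closed_patterns):
    """Turn each closed pattern (a, ..., y, z) into a rule: prefix -> z."""
    return {pat[:-1]: pat[-1] for pat in closed_patterns if len(pat) >= 2}

if __name__ == "__main__":
    # Hypothetical sequence database: the example trace from [0040],
    # split into four sequences purely for illustration.
    db = [list("CAABC"), list("ABCBD"), list("CABCA"), list("BCEABBCA")]
    rules = extract_rules(mine_closed_sequences(db))
    print(rules[("A", "B")])             # 'C': prefetch C after seeing A, B
```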

[0040] 1. Acquisition: Record in real time the sequence of file system logical block numbers requested by the CPU (in FAT32, for example, a logical block is a cluster composed of sectors); each file system logical block number is one item in the sequence. For example, the CPU memory access sequence acquired in real time is {CAABCABCBDCABCABCEABBCA}.
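A minimal sketch of this acquisition step, assuming a hypothetical AccessCollector hook invoked on every memory request; single characters stand in for logical block numbers, matching the example trace above.

```python
# Illustrative only: the class name and on_memory_request hook are assumed,
# not part of the patent. Each request appends the logical block number
# (e.g. a FAT32 cluster number) that it touched.

class AccessCollector:
    def __init__(self):
        self.trace = []                   # ordered logical block numbers

    def on_memory_request(self, logical_block):
        """Called in real time for every CPU request to memory."""
        self.trace.append(logical_block)

collector = AccessCollector()
for block in "CAABCABCBDCABCABCEABBCA":   # example trace from [0040]
    collector.on_memory_request(block)
print(len(collector.trace))               # 23 items recorded
```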

[0041] 2. Prepr...



Abstract

The invention discloses a cache data prefetching method based on incremental closed sequence mining. The data access sequence that the CPU issues to the memory is collected and converted into sequences entered into a sequence database; frequent closed sequences are mined with an incremental closed sequence mining algorithm, and cache data prefetch rules are extracted and used to guide the prefetching of cache data. In this way the semantic links between data blocks are exploited and prefetching becomes intelligent, which improves the cache hit ratio. The invention can mine the input sequences incrementally as the real-time environment changes and does not need to maintain a large number of candidate closed sequences, saving considerable space; the cache hit ratio is improved by 12%-25%. The invention is general and can easily be applied to real-time environments in which the sequence input database is dynamically updated with small changes at high frequency.
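As a hedged illustration of the last step, the sketch below shows one way extracted rules could guide prefetching: keep a short window of recent accesses and, when it matches a rule prefix, issue a prefetch for the predicted block. The window size and rule format are assumptions for illustration, not taken from the claims.

```python
from collections import deque

def guide_prefetch(access_stream, rules, window=2):
    """Return the blocks that would be prefetched while replaying the stream.
    rules maps a tuple of recent blocks (a prefix) to the predicted block."""
    history = deque(maxlen=window)
    prefetched = []
    for block in access_stream:
        history.append(block)
        for n in range(len(history), 0, -1):       # try the longest prefix first
            prefix = tuple(list(history)[-n:])
            if prefix in rules:
                prefetched.append(rules[prefix])   # issue the prefetch request
                break
    return prefetched

# Hand-written rule of the kind mined above (assumed): after A then B, expect C.
print(guide_prefetch("CAABCAB", {("A", "B"): "C"}))   # ['C', 'C']
```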

Description

technical field

[0001] The invention relates to cache data prefetching technology, and in particular to a cache data prefetching method based on incremental closed sequence mining.

Background technique

[0002] In recent years, in order to reduce the time a processor spends waiting to read data, techniques using a cache memory have been proposed. The relevant parts of a system-on-chip employing a cache memory are shown in Figure 1: the chip includes a processor, a cache, a cache prefetch module, a memory, and a system bus. The processor reads data from the cache, performs computations, and sends prefetch control information to the cache prefetch module; the cache stores the data to be used by the processor; the cache prefetch module reads the data to be used by the processor from the memory and transfers it to the cache; the memory stores the various data; the system-on-chip bus connects the cache prefetch module and the m...
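A minimal sketch, under assumed interfaces, of how these components could interact: the processor asks the cache first, misses go to memory over the bus, and the prefetch module pushes predicted blocks into the cache ahead of the processor. The classes, the notify hook, and the single rule are illustrative assumptions, not the patent's design.

```python
# Illustrative model of the SoC roles in [0002]; all names are assumed.

class Memory:
    def read(self, block):
        return f"data[{block}]"            # stand-in for a bus transfer

class Cache:
    def __init__(self):
        self.lines = {}
    def lookup(self, block):
        return self.lines.get(block)       # None means a miss
    def fill(self, block, data):
        self.lines[block] = data

class PrefetchModule:
    def __init__(self, cache, memory, rules):
        self.cache, self.memory, self.rules = cache, memory, rules
        self.last = None
    def notify(self, block):
        """The processor reports each access; prefetch the predicted follower."""
        key = (self.last, block)
        if key in self.rules:
            predicted = self.rules[key]
            self.cache.fill(predicted, self.memory.read(predicted))
        self.last = block

memory, cache = Memory(), Cache()
prefetcher = PrefetchModule(cache, memory, {("A", "B"): "C"})
hits = 0
for block in "ABCAB":
    if cache.lookup(block) is None:        # miss: fetch over the bus
        cache.fill(block, memory.read(block))
    else:
        hits += 1
    prefetcher.notify(block)
print(hits)   # 3: C was prefetched, and A and B hit on their repeat accesses
```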

Claims


Application Information

IPC(8): G06F12/08, G06F17/30, G06F12/0862
Inventor 陈刚, 蔡铭, 李山亭
Owner ZHEJIANG UNIV