
Queue cache management method and system, storage medium, computer equipment and application

A cache-management and queueing technology applied in the field of data exchange, addressing problems such as reduced overall system speed, wasted storage, and increased overhead.

Active Publication Date: 2020-12-15
XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0002] At present, on-chip storage resources are commonly used to buffer data frames in order to increase the frame-switching rate, but these resources are scarce. The queue management module of a switching system must manage a large number of queues, and data frames of different queues (and even different priorities) must not be stored together out of order, so frames of different queues have to be stored in separate RAM areas. On-chip storage is limited, however, and on-chip Block RAM comes in fixed sizes (36 Kb and 18 Kb), so instantiating many separate RAMs produces a large amount of internal storage fragmentation.
[0003] A better method is to divide the Block RAM resources into relatively small fixed-size areas, for example 64 bytes (the minimum Ethernet frame length). The idea is that when a data frame requests buffer allocation, the queue management module splits it into several 64-byte fragments for storage. If the final fragment is shorter than 64 bytes, it still occupies a complete fragment; in the worst case a stored data frame therefore carries 63 bytes of internal fragmentation. Dividing the storage into even smaller areas would reduce internal fragmentation, but it would also increase the linked-list overhead of managing those areas and lengthen the enqueue step of requesting buffer space, reducing the overall speed of the system.
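The fragmentation arithmetic described above can be sketched as follows. This is an illustrative model only (function names are hypothetical, not from the patent); it computes how many 64-byte fragments a frame occupies and how many bytes of internal fragmentation the last fragment wastes.

```python
# Illustrative sketch of fixed-length fragment allocation, assuming
# 64-byte fragments (the minimum Ethernet frame length, per the text).
FRAGMENT_SIZE = 64

def fragments_needed(frame_len: int) -> int:
    """Number of 64-byte fragments a frame of frame_len bytes occupies."""
    return -(-frame_len // FRAGMENT_SIZE)  # ceiling division

def internal_fragmentation(frame_len: int) -> int:
    """Wasted bytes inside the final, partially filled fragment."""
    return fragments_needed(frame_len) * FRAGMENT_SIZE - frame_len

# Worst case: a frame one byte past a fragment boundary wastes 63 bytes.
assert internal_fragmentation(65) == 63
```

A 65-byte frame occupies two fragments and wastes 63 bytes, matching the worst case the paragraph describes; a frame that is an exact multiple of 64 bytes wastes nothing.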
[0004] The difficulty of the above problems and defects is as follows: if a queue management scheme based on fragmenting frames of random length is adopted, not only must the fixed-length fragments of the entire storage area be chained together as linked lists, but the linked lists that manage those areas incur additional overhead. Moreover, the total number of fixed-length fragments occupied by a frame of random length is unpredictable, so a frame about to enter the queue may require multiple buffer-allocation requests, which limits the switching rate of the whole system. Correct forwarding of data frames then requires not only a complex queue management mechanism but also complex buffer management and buffer lookup, which greatly increases the difficulty of the overall design.

Method used




Embodiment Construction

[0074] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with examples. It should be understood that the specific embodiments described here serve only to explain the present invention, not to limit it.

[0075] Aiming at the problems existing in the prior art, the present invention provides a queue buffer management method, system, storage medium, computer equipment and application, which are described in detail below with reference to the accompanying drawings.

[0076] As shown in Figure 1, the queue buffer management method provided by the present invention includes the following steps:

[0077] S101: Frame the variable-length data frames into fixed-length frames of a fixed number of bytes, and initiate a request to the queue buffer management module to apply for joining the queue. Since ...
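Step S101 can be sketched in software as splitting a variable-length frame into fixed-length pieces. This is a hedged illustration only: the patent does not specify the padding scheme, so zero-padding of the final piece is an assumption here, and the function name is hypothetical.

```python
def frame_to_fixed_length(data: bytes, fixed_len: int = 64) -> list:
    """Split a variable-length data frame into fixed-length frames.

    The final piece is zero-padded to fixed_len (an illustrative choice;
    the patent text does not specify how short tails are handled).
    """
    chunks = [data[i:i + fixed_len] for i in range(0, len(data), fixed_len)]
    if chunks and len(chunks[-1]) < fixed_len:
        chunks[-1] = chunks[-1].ljust(fixed_len, b'\x00')
    return chunks
```

For example, a 100-byte frame becomes two 64-byte fixed-length frames, the second carrying 28 padding bytes.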



Abstract

The invention belongs to the technical field of data exchange and discloses a queue cache management method and system, a storage medium, computer equipment and an application. The method includes framing data frames that have undergone flow classification and grouping before they enter the queue cache management module, converting the variable-length Ethernet data frames into fixed-length frames of a fixed number of bytes, and applying peripheral control to an on-chip Block RAM so that a configurable multi-channel FIFO queue is formed for storing the fixed-length frames. According to the invention, the fixed-length frames of different queues are stored in a single Block RAM, and the storage area can be configured to present itself externally either as one whole RAM or as multiple FIFO queues, with a suitable storage scheme selected per storage area. This increases the utilization of storage resources and improves the efficiency of processing and forwarding data frames. Internal fragmentation is avoided as much as possible, the overall rate of the system is improved, and the utilization of storage resources is greatly increased.
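The abstract's central idea, one shared storage area presented outward as several configurable FIFO queues of fixed-length frames, can be modeled minimally in software. This is a sketch under stated assumptions (even partitioning of slots among queues, class and method names invented here), not the patented hardware design.

```python
from collections import deque

class MultiChannelFIFO:
    """Minimal software model of a shared storage area presented as
    several configurable FIFO queues of fixed-length frames.
    Illustrative only; not the patented Block RAM controller."""

    def __init__(self, total_slots: int, num_queues: int):
        # Assumption: the shared storage is partitioned evenly.
        self.capacity = total_slots // num_queues
        self.queues = [deque() for _ in range(num_queues)]

    def enqueue(self, qid: int, frame: bytes) -> bool:
        """Append a fixed-length frame; fail if the queue region is full."""
        if len(self.queues[qid]) >= self.capacity:
            return False
        self.queues[qid].append(frame)
        return True

    def dequeue(self, qid: int):
        """Pop the oldest frame, or None if the queue is empty."""
        return self.queues[qid].popleft() if self.queues[qid] else None
```

Because each queue holds only fixed-length frames, one allocation suffices per enqueue, which is the rate advantage the abstract claims over per-fragment allocation.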

Description

technical field [0001] The invention belongs to the technical field of data exchange, and in particular relates to a queue buffer management method, system, storage medium, computer equipment and application. Background technique [0002] At present, on-chip storage resources are commonly used to buffer data frames in order to increase the frame-switching rate, but these resources are scarce. The queue management module of a switching system must manage a large number of queues, and data frames of different queues (and even different priorities) must not be stored together out of order, so frames of different queues have to be stored in separate RAM areas. On-chip storage is limited, however, and on-chip Block RAM comes in fixed sizes (36 Kb and 18 Kb), so instantiating many separate RAMs produces a large amount of internal storage fragmentation. A better method is to divide the Block RAM resources into relatively fixed ...

Claims


Application Information

IPC(8): G06F15/78, G06F16/901
CPC: G06F15/781, G06F15/7846, G06F16/9024
Inventor 潘伟涛韩冰邱智亮高志凯熊子豪
Owner XIDIAN UNIV