
Reference data access management method and device

A reference data access management technology applied in the field of video coding and decoding. It addresses the problems that comparable schemes are complicated and difficult to design and verify, and that no H.264 macroblock-parallel encoder has adopted such a design, achieving the effects of reduced on-chip RAM overhead, low hardware overhead, and elimination of duplicated buffer block storage.

Active Publication Date: 2014-04-16
SHANGHAI FULLHAN MICROELECTRONICS

AI Technical Summary

Problems solved by technology

[0010] Treating the local buffer inside each first-level data access control device, such as a sliding window or a cache, as distributed RAM and managing it under a NUMA (non-uniform memory access) architecture helps to reduce the RAM overhead. However, common cache coherence protocols such as MOESI are too complex for an H.264 encoding core and are difficult to design and verify.
Moreover, the data management inside the data access control module of each coding core must take the other coding cores into account, which makes the design extraordinarily complicated.
No H.264 macroblock-parallel encoder using such a design has been reported so far.




Embodiment Construction

[0037] The present invention will be further described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0038] Referring first to Figure 4, the present invention is composed of multiple first-level cache units 101 (L1 cache), each exclusive to one encoding core 100, a second-level cache unit 102 (L2 cache) shared by all encoding cores, and a bus 103 connecting the two.
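
To make the topology concrete, the following is a minimal C sketch of the hierarchy described in paragraph [0038], combined with the lookup rule stated in the abstract (external storage is consulted only when a block is absent from both cache levels). All identifiers, sizes, and the trivial slot-0 replacement policy are illustrative assumptions, not details taken from the patent.

```c
/* Hypothetical sketch of the two-level reference-data hierarchy:
 * one private L1 2D cache per encoding core, one L2 cache shared by
 * all cores over a bus, and external storage behind the L2.
 * Names and sizes are illustrative, not taken from the patent. */
#include <stdint.h>
#include <stdbool.h>
#include <string.h>

#define NUM_CORES   4      /* assumed number of macroblock-row parallel cores */
#define L1_SLOTS    16     /* assumed number of buffer blocks held per L1     */
#define L2_SLOTS    256    /* assumed number of buffer blocks held in L2      */
#define BLOCK_W     64     /* assumed buffer block width in pixels            */
#define BLOCK_H     16     /* assumed buffer block height in pixels           */

typedef struct {           /* one buffer block of reference pixels */
    int32_t bx, by;        /* block coordinates in the reference frame        */
    bool    valid;
    uint8_t pixels[BLOCK_H][BLOCK_W];
} ref_block_t;

typedef struct { ref_block_t slot[L1_SLOTS]; } l1_cache_t;  /* private, one per core */
typedef struct { ref_block_t slot[L2_SLOTS]; } l2_cache_t;  /* shared by all cores   */

static l1_cache_t l1[NUM_CORES];   /* cache units 101, one per encoding core 100 */
static l2_cache_t l2;              /* cache unit 102, reached through bus 103     */

/* Stand-in for reading one buffer block from external storage (DRAM). */
static void external_fetch(int32_t bx, int32_t by, ref_block_t *dst)
{
    dst->bx = bx; dst->by = by; dst->valid = true;
    memset(dst->pixels, 0, sizeof dst->pixels);   /* real hardware would DMA here */
}

/* Linear search over the cached buffer blocks of one cache level. */
static ref_block_t *find(ref_block_t *slots, int n, int32_t bx, int32_t by)
{
    for (int i = 0; i < n; i++)
        if (slots[i].valid && slots[i].bx == bx && slots[i].by == by)
            return &slots[i];
    return NULL;
}

/* Lookup rule from the abstract: external storage is accessed only when the
 * block is absent from both the core's L1 and the shared L2. Replacement is
 * simplified to "overwrite slot 0"; the real device would use its own policy. */
static const ref_block_t *get_ref_block(int core, int32_t bx, int32_t by)
{
    ref_block_t *hit = find(l1[core].slot, L1_SLOTS, bx, by);
    if (hit) return hit;                                   /* L1 hit            */

    ref_block_t *l2hit = find(l2.slot, L2_SLOTS, bx, by);  /* over bus 103      */
    if (!l2hit) {                                          /* L2 miss           */
        l2hit = &l2.slot[0];
        external_fetch(bx, by, l2hit);                     /* only path to DRAM */
    }
    l1[core].slot[0] = *l2hit;                             /* fill private L1   */
    return &l1[core].slot[0];
}
```

A core's motion estimation module would then obtain any reference block through get_ref_block(core, bx, by); blocks already fetched on behalf of a neighbouring core hit in the shared L2, which is what removes the duplicated external accesses the abstract refers to.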

[0039] The L1 cache unit 101 is a first multi-port 2D cache. It is connected to the L2 cache unit 102 through the bus 103 and to the motion estimation module of its corresponding encoding core 100. It directly supplies reference data to the motion estimation module of the encoding core 100, and requests reference data from the outside by caching two-dimensional image blocks of a specified buffer block size. In this way, the L1 cache unit 101 converts the reference data accesses issued inside the encoder into external ima...
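
Paragraph [0039] describes the L1 unit requesting reference data in units of two-dimensional image blocks of a specified buffer block size. The fragment below is a small, self-contained sketch of that address translation alone: it tiles a pixel-domain reference window into the grid of buffer blocks that would then be looked up in L1 and, on misses, requested from the L2 over the bus. The block dimensions and the callback are hypothetical.

```c
/* Hypothetical sketch: translate a pixel-domain reference window
 * (x, y, w, h) into the buffer-block-granular requests that an L1
 * 2D cache would issue. BLOCK_W/BLOCK_H are assumed example sizes. */
#include <stdint.h>

#define BLOCK_W 64
#define BLOCK_H 16

typedef void (*block_request_fn)(int32_t bx, int32_t by, void *ctx);

/* Visit every buffer block overlapped by the requested window;
 * assumes non-negative coordinates inside the reference frame. */
static void request_window(int32_t x, int32_t y, int32_t w, int32_t h,
                           block_request_fn request, void *ctx)
{
    int32_t bx0 = x / BLOCK_W;
    int32_t by0 = y / BLOCK_H;
    int32_t bx1 = (x + w - 1) / BLOCK_W;    /* inclusive last block column */
    int32_t by1 = (y + h - 1) / BLOCK_H;    /* inclusive last block row    */

    for (int32_t by = by0; by <= by1; by++)
        for (int32_t bx = bx0; bx <= bx1; bx++)
            request(bx, by, ctx);           /* L1 looks up each block and
                                               forwards misses to the shared L2 */
}
```

Because the search windows of neighbouring macroblock rows overlap, the same (bx, by) pairs recur across cores, which is the kind of access correlation the two-level cache is intended to exploit.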



Abstract

The invention discloses a reference data access management method and device suitable for H.264/AVC parallel encoding devices. The reference data access management device comprises a plurality of first-stage buffering units, each connected to a coding core; the first-stage buffering units are connected to a second-stage buffering unit through a bus, and the second-stage buffering unit is further connected to an external storage device. Only when the data accessed by a coding core exists in neither the first-stage buffering units nor the second-stage buffering unit is the data obtained from the external storage device. With this method and device, the correlation of reference data accesses among the cores of a macroblock-level multi-core parallel H.264 encoder can be fully exploited, and the amount of reference data traffic to the external storage device is greatly reduced. Duplicated buffer block storage across the caches is eliminated as far as possible, and the hardware cost of the reference data access management device is very low.

Description

Technical field
[0001] The present invention relates to the field of video coding and decoding, and in particular to a reference data management device suitable for H.264/AVC parallel coding devices, namely a reference data access management method and device for parallel coding that takes the macroblock row (Macro-Block Row) as its basic unit.
Background technique
[0002] The H.264/AVC standard has been accepted by the industry for its excellent image compression performance. However, as the image format to be encoded grows from CIF to D1, 720P, 1080P and even 4Kx2K (Ultra HD), the heavy computational load of H.264 encoding has become a major problem in fields with strict low-power requirements. The method the industry uses to reduce H.264 encoding power consumption at the algorithm implementation stage is to lower the hardware/processor clock frequency during encoding through parallel processing, and then reduce the ha...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30
CPC: G06F12/0877
Inventors: 诸悦, 高厚新, 陈晓春, 章旭东, 刘斌, 刘翔, 陈子遇
Owner: SHANGHAI FULLHAN MICROELECTRONICS