Particle filter based multi-frame reference motion estimation method

A motion estimation and particle filtering technology, applied in the field of particle-filter-based multi-frame reference motion estimation, which can solve the problem that the search accuracy of existing methods is not high enough.

Status: Inactive | Publication Date: 2011-09-14
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

[0005] Although the above-mentioned traditional multi-frame reference motion estimation methods improve the search speed to some extent, they all share the drawback that the search accuracy is not high enough.



Examples


Embodiment Construction

[0043] As shown in Figure 1, the following example illustrates the particle-filter-based multi-frame reference motion estimation method of the present invention; its steps are as follows:

[0044] (1) In the image sequence, the z coded frames nearest to the frame to be coded are used as reference frames, where 1≤z≤16.

[0045] In this embodiment, taking the f-th frame to be encoded as an example, z is set to 10. The (f-1)-th frame is taken as the current reference frame, and the full search method is used in the (f-1)-th frame to perform a motion search for one of the blocks to be coded in the f-th frame (taking the g-th block as an example), obtaining the encoding cost of every point in the search window of the (f-1)-th frame, where the minimum encoding cost is C_min. The K points with the smallest encoding costs are selected to form a particle set, where 10≤K≤100; in this embodiment, K is 10. The point corresponding to C_min is chosen as the current best particle ...
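Paragraph [0045] is truncated above, so the following sketch illustrates only the initialization it describes: a full search in the (f-1)-th reference frame yields the coding cost of every point in the search window, the K lowest-cost points form the initial particle set, and the point with cost C_min becomes the current best particle. It is a minimal Python illustration that assumes luma-only frames and a plain SAD coding cost; the names sad_cost and init_particles and the default block size and search range are illustrative, not taken from the patent.

import numpy as np

def sad_cost(cur_frame, ref_frame, x, y, dx, dy, block=16):
    """Sum of absolute differences between the block at (x, y) in the
    current frame and the block displaced by (dx, dy) in the reference frame."""
    cur = cur_frame[y:y + block, x:x + block].astype(np.int32)
    ref = ref_frame[y + dy:y + dy + block, x + dx:x + dx + block].astype(np.int32)
    return int(np.abs(cur - ref).sum())

def init_particles(cur_frame, ref_frame, x, y, search_range=16, K=10, block=16):
    """Full search over a (2*search_range+1)^2 window centred on (x, y);
    returns the K lowest-cost motion vectors (the initial particle set),
    the best particle and its cost C_min."""
    h, w = ref_frame.shape
    candidates = []
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            # skip displacements whose block falls outside the reference frame
            if not (0 <= x + dx <= w - block and 0 <= y + dy <= h - block):
                continue
            candidates.append(((dx, dy), sad_cost(cur_frame, ref_frame, x, y, dx, dy, block)))
    candidates.sort(key=lambda c: c[1])
    particles = candidates[:K]            # the K points with the smallest coding cost
    best_particle, c_min = particles[0]   # the point whose cost is C_min
    return particles, best_particle, c_min

In a real encoder the coding cost would also include the rate of coding the motion vector; plain SAD is used here only to keep the sketch short.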



Abstract

The invention discloses a particle filter based multi-frame reference motion estimation method, belonging to the technical field of video compression coding. The method comprises the following steps: searching for the block to be coded in the adjacent reference frame with a full search method, and selecting the points with the lowest coding cost to form an initial particle set; updating the initial particle set to obtain the particle set of the next reference frame; calculating the weight of each particle in the particle set of the next reference frame and estimating the optimal particle; re-sampling according to the weight of each particle, and updating the re-sampled particles to obtain the particle set of the following reference frame; repeating the weighting, estimation and update steps until the maximum reference frame is reached, and coding the block to be coded with the block corresponding to the searched particle with the lowest coding cost; and traversing the whole frame to complete the coding of the whole frame image. By using the particle filter method for multi-frame reference motion estimation, the motion estimation time is greatly reduced while the coding quality is maintained.
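The abstract summarizes the whole filtering loop: for each farther reference frame, update (predict) the particle set, weight each particle by its coding cost, estimate the best particle, resample by weight, and stop at the maximum reference frame. The sketch below follows that outline and builds on the initialization sketch in the embodiment section; the small random jitter used as the prediction step, the exp(-cost/temp) weighting, and all function names are assumptions made for illustration rather than the patent's exact formulas.

import numpy as np

def sad(cur_frame, ref_frame, x, y, dx, dy, block=16):
    # sum of absolute differences, used here as the coding cost of a candidate MV
    cur = cur_frame[y:y + block, x:x + block].astype(np.int32)
    ref = ref_frame[y + dy:y + dy + block, x + dx:x + dx + block].astype(np.int32)
    return float(np.abs(cur - ref).sum())

def particle_filter_me(cur_frame, ref_frames, x, y, init_mvs, K=10, block=16, temp=256.0, seed=0):
    """ref_frames[0] is the nearest coded frame (f-1), ref_frames[-1] the farthest;
    init_mvs is the particle set found by the full search in ref_frames[0]
    (see the initialization sketch above).
    Returns (best_reference_index, best_motion_vector, best_cost)."""
    rng = np.random.default_rng(seed)
    mvs = np.asarray(init_mvs, dtype=np.int64)

    # best particle so far, taken from the nearest reference frame
    costs = np.array([sad(cur_frame, ref_frames[0], x, y, int(dx), int(dy), block)
                      for dx, dy in mvs])
    i = int(costs.argmin())
    best_ref, best_mv, best_cost = 0, (int(mvs[i, 0]), int(mvs[i, 1])), costs[i]

    for r in range(1, len(ref_frames)):              # loop until the maximum reference frame
        ref = ref_frames[r]
        h, w = ref.shape
        # (a) update / predict: jitter the particles to get the set for this frame
        mvs = mvs + rng.integers(-2, 3, size=mvs.shape)
        mvs[:, 0] = np.clip(mvs[:, 0], -x, w - block - x)
        mvs[:, 1] = np.clip(mvs[:, 1], -y, h - block - y)
        # (b) weight each particle from its coding cost in this reference frame
        costs = np.array([sad(cur_frame, ref, x, y, int(dx), int(dy), block)
                          for dx, dy in mvs])
        weights = np.exp(-(costs - costs.min()) / temp)
        weights /= weights.sum()
        # (c) estimate: keep the lowest-cost particle found so far
        i = int(costs.argmin())
        if costs[i] < best_cost:
            best_ref, best_mv, best_cost = r, (int(mvs[i, 0]), int(mvs[i, 1])), costs[i]
        # (d) resample K particles according to their weights
        idx = rng.choice(len(mvs), size=K, p=weights)
        mvs = mvs[idx]

    return best_ref, best_mv, best_cost

Given the particles returned by init_particles in the earlier sketch, one would call particle_filter_me(cur_frame, ref_frames, x, y, [mv for mv, _ in particles]); the encoder would then code the block against the block at best_mv in reference frame best_ref.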

Description

Technical field

[0001] The invention belongs to the technical field of video compression coding and relates to a particle-filter-based multi-frame reference motion estimation method.

Background technique

[0002] Motion estimation is an important part of video compression coding technology; its role is to effectively remove the temporal redundancy of image sequences. In the process of video compression coding, motion estimation occupies 50%-70% of the computation, so improving the efficiency of motion estimation is of great significance for speeding up video coding.

[0003] The multi-reference-frame motion estimation method is a new technique introduced by the H.264 standard. By selecting multiple reference frames for the motion search, it can effectively avoid problems such as periodic object motion and camera shake, and makes the matching block found by the search closer to the block to be encoded, which greatly reduces the amount of code required for encoding and increases...


Application Information

IPC(8): H04N7/26, H04N7/46, H04N7/50, H04N19/132, H04N19/176, H04N19/51, H04N19/513, H04N19/57, H04N19/587, H04N19/59, H04N19/61
Inventor 丁勇, 宋文华, 孙纲德, 王翔, 张渊, 叶森, 贾梦楠, 刘钧石, 张东, 严晓浪
Owner ZHEJIANG UNIV