
Method of signalling motion information for efficient scalable video compression

A technology concerning motion information and video compression, applied to colour television with bandwidth reduction, television systems, instruments and related fields, and addressing problems such as video browsing solutions that are not effective, compressed bit-rates limited to below the capacity of the relevant network connections, and video quality and duration that remain disappointing.

Status: Inactive · Publication date: 2006-10-26
Assignee: UNISEARCH LTD
Cites: 8 · Cited by: 59

AI Technical Summary

Benefits of technology

[0022] Thus, because each motion field is represented in coarse to fine fashion and interleaved with the video data bit-stream, the accuracy required for motion representation can be balanced with the accuracy of the transformed sample values which may be recovered from the bit-stream. Therefore, a fully scalable video bit-stream may be progressively refined, both in regard to its quantised sample representations and in regard to its motion representation.
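To picture the balance described in [0022], the following Python sketch interleaves coarse-to-fine motion contributions with embedded sample contributions. The Layer structure and the level-by-level ordering are illustrative assumptions for this sketch only, not the patent's bit-stream syntax; they merely show coarse motion data preceding the sample data that depends on it.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Layer:
    kind: str       # "motion" or "samples" (hypothetical labels)
    frame: int      # frame or temporal-subband index
    level: int      # refinement level, 0 = coarsest
    payload: bytes  # embedded contribution for this level

def interleave(motion: List[Layer], samples: List[Layer]) -> List[Layer]:
    """Order embedded contributions so that, at any truncation point,
    the accuracy of the motion representation roughly tracks the
    accuracy of the quantised samples recovered so far."""
    stream: List[Layer] = []
    top = max(l.level for l in motion + samples)
    for level in range(top + 1):
        # Coarse motion first, then the sample contributions that rely on it.
        stream += [l for l in motion if l.level == level]
        stream += [l for l in samples if l.level == level]
    return stream
```

Truncating such a stream at any point leaves motion and sample refinements at comparable levels, which is the progressive-refinement property the paragraph describes.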
[0048] This further reduces the motion information to one motion field per video frame, even for the 5/3 transform. The method of this embodiment has the property that the motion representation is temporally scalable: only one motion field need be made available to the decoder for each video frame it can reconstruct, at any selected temporal resolution. The method involves judicious compositing of the forward and backward motion fields from different temporal resolution levels, and is compatible with the efficient motion estimation method described above, in which motion fields at higher temporal resolutions are composited to obtain motion fields at lower resolutions. A small sketch of the compositing operation follows.
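As a hedged illustration of compositing (the patent does not mandate dense per-pixel fields or nearest-neighbour resampling; both are assumptions of this sketch), a field mapping frame a onto frame b can be combined with a field mapping frame b onto frame c to approximate the field from a to c:

```python
import numpy as np

def composite(w_ab: np.ndarray, w_bc: np.ndarray) -> np.ndarray:
    """w_ab, w_bc: (H, W, 2) arrays of (dy, dx) displacements.
    Returns an approximation to the composed field a -> c by sampling
    w_bc at the (rounded, clipped) positions reached through w_ab."""
    h, w = w_ab.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    yb = np.clip(np.rint(ys + w_ab[..., 0]).astype(int), 0, h - 1)
    xb = np.clip(np.rint(xs + w_ab[..., 1]).astype(int), 0, w - 1)
    return w_ab + w_bc[yb, xb]
```

Applying such a composition across temporal levels yields the single composited field per reconstructed frame that the paragraph refers to.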

Problems solved by technology

Currently, most video content which is available over the internet must be pre-loaded in a process which can take many minutes over typical modem connections, after which the video quality and duration can still be quite disappointing.
In some contexts video streaming is possible, where the video is decompressed and rendered in real-time as it is being received; however, this is limited to compressed bit-rates which are lower than the capacity of the relevant network connections.
Approaches of this type, however, do not represent effective solutions to the video browsing problem.
Storage must then be found on the video server for each of these separately compressed bit-streams, which is clearly wasteful.
More importantly, if the quality associated with a low bit-rate version of the video is found to be insufficient, a complete new version must be downloaded at a higher bit-rate; this new bit-stream will take even longer to download, which generally rules out any possibility of video streaming.
By contrast, with a truly scalable bit-stream, if the quality associated with a low bit-rate version of the video is found to be insufficient, only the incremental contribution required to achieve the next higher level of quality need be retrieved from the server.
In practice, this means that most of the available bits in a low bit-rate portion of the video are likely to contribute to reconstruction at a reduced frame rate, since attempting to recover the full frame-rate video over a low bit-rate channel would result in unacceptable deterioration of the spatial detail within each frame.
The cost of estimating, coding and transmitting the above motion fields can be substantial.
Moreover, this cost may adversely affect the scalability of the entire compression scheme, since it is not immediately clear how to progressively refine the motion fields without destroying the subjective properties of the reconstructed video when the motion is represented with reduced accuracy.
However, no particular solution is presented to the difficulties described above.




Embodiment Construction

1st Aspect: Reciprocal Motion Fields

[0059] A natural strategy for estimating the reciprocal motion fields, W_{2k→2k+1} and W_{2k+1→2k}, would be to determine the parameters for W_{2k→2k+1} which minimise some measure (e.g., energy) of the mapping residual x_{2k+1} − W_{2k→2k+1}(x_{2k}), and to separately determine the parameters for W_{2k+1→2k} which minimise some measure of its residual signal, x_{2k} − W_{2k+1→2k}(x_{2k+1}). In general, such a procedure will lead to parameters for W_{2k→2k+1} which cannot be deduced from those for W_{2k+1→2k}, and vice-versa, so that both sets of parameters must be sent to the decoder.
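For concreteness, one common way of minimising such a residual energy is exhaustive block matching. The sketch below is an assumption for illustration only; the patent does not prescribe this estimator, block size or search range.

```python
import numpy as np

def estimate_motion(src: np.ndarray, tgt: np.ndarray, block: int = 8, search: int = 7):
    """Return one (dy, dx) vector per block of `tgt` (e.g. x_{2k+1}),
    matching into `src` (e.g. x_{2k}) by minimising residual energy."""
    h, w = tgt.shape
    field = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            ref = tgt[by:by + block, bx:bx + block].astype(float)
            best, best_v = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = src[y:y + block, x:x + block].astype(float)
                        err = float(np.sum((ref - cand) ** 2))  # residual energy
                        if err < best:
                            best, best_v = err, (dy, dx)
            field[by // block, bx // block] = best_v
    return field
```

Running this independently in each direction would produce two unrelated parameter sets, which is exactly the redundancy the next paragraph avoids.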

[0060] It turns out that only one of the two motion fields must be directly estimated. The other can then be deduced by “inverting” the motion field which was actually estimated. Both the compressor and the decompressor may perform this inversion so that only one motion field must actually be transmitted.
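A rough picture of such an inversion is given below. This is an illustrative approximation only, not the patent's procedure: it simply scatters negated displacement vectors to the positions they point at and leaves positions that are never reached (occluded or uncovered areas) at zero.

```python
import numpy as np

def invert_field(w_fwd: np.ndarray):
    """w_fwd: (H, W, 2) displacements from frame A to frame B.
    Returns (w_bwd, reached): an approximate field from B back to A,
    plus a mask marking positions of B that some pixel of A mapped to."""
    h, w = w_fwd.shape[:2]
    w_bwd = np.zeros_like(w_fwd)
    reached = np.zeros((h, w), dtype=bool)
    ys, xs = np.mgrid[0:h, 0:w]
    yb = np.clip(np.rint(ys + w_fwd[..., 0]).astype(int), 0, h - 1)
    xb = np.clip(np.rint(xs + w_fwd[..., 1]).astype(int), 0, w - 1)
    w_bwd[yb, xb] = -w_fwd   # where several pixels collide, the last write wins
    reached[yb, xb] = True
    # Unreached positions keep a zero vector here; a practical scheme
    # must treat such occlusions more carefully.
    return w_bwd, reached
```

Because the decompressor can run the same deterministic procedure on the decoded field, only one of the two reciprocal fields needs to appear in the bit-stream, as paragraph [0060] notes.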

[0061] True scene motion fields cannot generally be inverted, due to the presence of occlusions and ...



Abstract

A method for incrementally coding and signalling motion information for a video compression system involving a motion adaptive transform and embedded coding of transformed video samples comprises the steps of: (a) producing an embedded bit-stream, representing each motion field in coarse to fine fashion; and (b) interleaving incremental contributions from said embedded motion fields with incremental contributions from said transformed video samples. A further embodiment of a method for estimating and signalling motion information for a motion adaptive transform based on temporal lifting steps comprises the steps of: (a) estimating and signalling motion parameters describing a first mapping from a source frame onto a target frame within one of the lifting steps; and (b) inferring a second mapping between either said source frame or said target frame, and another frame, based on the estimated and signalled motion parameters associated with said first mapping.

Description

FIELD OF THE INVENTION [0001] The present invention relates to efficient compression of motion video sequences and, in preferred embodiments, to a method for producing a fully scalable compressed representation of the original video sequence while exploiting motion and other spatio-temporal redundancies in the source material. The invention relates specifically to the representation and signalling of motion information within a scalable compression framework which employs motion adaptive wavelet lifting steps. Additionally, the present invention relates to the estimation of motion parameters for scalable video compression and to the successive refinement of motion information by temporal resolution, spatial resolution or precision of the parameters. BACKGROUND OF THE INVENTION [0002] For the purpose of the present discussion, the term “internet” will be used both in its familiar sense and also in its generic sense to identify a network connection over any electronic communications m...


Application Information

IPC (IPC8): H04N11/04; H04B1/66; H04N11/02; H04N7/12; G06T9/00; H04N7/26
CPC: H04N19/52; H04N19/147; H04N19/46; H04N19/122; H04N19/54; H04N19/62; H04N19/635; H04N19/31; H04N19/19; H04N19/184
Inventors: TAUBMAN, DAVID; SECKER, ANDREW
Owner: UNISEARCH LTD