
Method for partial reference evaluation of wireless videos based on space-time domain feature extraction

A feature-extraction and wireless-video technology, applied in the field of video quality evaluation, addressing problems such as the large amount of reference data required, the difficulty of building an accurate evaluation model, and the incompleteness of current research on the human visual system (HVS)

Inactive Publication Date: 2010-06-16
XIAMEN UNIV +1
Cites: 0 · Cited by: 43

AI Technical Summary

Problems solved by technology

[0005] (1) The most widely used current algorithms are based on mean-squared signal-to-noise measures; representative methods such as MSE (Mean Squared Error) and PSNR (Peak Signal-to-Noise Ratio) evaluate image quality from the error between corresponding pixels, i.e. from error sensitivity. Because they ignore the special properties of the human visual system, their quality scores for video content may deviate from human subjective evaluation;
[0006] (2) Full-reference models require the complete material under test to be transmitted over the wireless link, so the amount of data needed for evaluation is large;
[0007] (3) The evaluation delay and the occupied bandwidth are both large, making full-reference models unsuitable for real-time evaluation in wireless applications;
[0012] (3) For partial-reference methods, current international research is not yet mature, and research on the HVS is not yet thorough, so it remains difficult to establish an accurate, unified model that allows a partial-reference method to track subjective evaluation consistently.
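The pixel-error measures criticized in [0005] can be stated in a few lines. The sketch below is a minimal illustration of MSE and PSNR on two toy frames; the frame contents and function names are invented for the example and are not taken from the patent:

```python
import numpy as np

def mse(ref: np.ndarray, dist: np.ndarray) -> float:
    """Mean squared error between a reference and a distorted frame."""
    return float(np.mean((ref.astype(np.float64) - dist.astype(np.float64)) ** 2))

def psnr(ref: np.ndarray, dist: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    e = mse(ref, dist)
    if e == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / e)

# Two toy 8-bit "frames": a uniform gray image and a copy with one pixel error.
ref = np.full((4, 4), 128, dtype=np.uint8)
dist = ref.copy()
dist[0, 0] = 132            # a single 4-level pixel error
print(mse(ref, dist))       # 1.0 (one squared error of 16 over 16 pixels)
print(round(psnr(ref, dist), 2))  # 48.13
```

Note that both measures depend only on per-pixel differences: a barely visible uniform brightness shift and a highly visible structural distortion can yield the same score, which is exactly the mismatch with subjective evaluation that [0005] describes.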




Embodiment Construction

[0100] The present invention is mainly applied to 3G wireless video services such as videophone, as well as digital broadcasting and television services. The system comprises three parts: the sending end, the wireless channel, and the receiving end.

[0101] The sending device needs to be equipped with a memory, a real-time processing chip, a camera, and a wireless transmitting module. In practical applications, a typical sending-end device is a mobile terminal, with the video coming from the phone's camera. The sending end establishes two independent channels. The main channel requires wide bandwidth and a high rate for real-time transmission of the video data stream; the secondary (auxiliary) channel is used mainly to transmit the extracted video feature parameters. These parameters involve only a small amount of data yet are highly representative, so the auxiliary channel has lower bandwidth and rate requirements. These parameters mainly ...
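The two-channel, reduced-reference flow in [0101] can be sketched as follows. This is a hypothetical illustration of the general idea only: the particular features (frame mean, standard deviation, mean gradient magnitude) and the Euclidean comparison are invented for the example and are not the patent's actual parameters:

```python
import numpy as np

def extract_features(frame: np.ndarray) -> np.ndarray:
    """Hypothetical compact spatial features: a few numbers stand in
    for the full frame, so they fit a low-rate auxiliary channel."""
    f = frame.astype(np.float64)
    gy, gx = np.gradient(f)                 # vertical/horizontal gradients
    grad = np.sqrt(gx ** 2 + gy ** 2)
    return np.array([f.mean(), f.std(), grad.mean()])

def feature_distance(sent: np.ndarray, received: np.ndarray) -> float:
    """Receiver-side comparison of auxiliary-channel features against
    features recomputed from the received (possibly degraded) video."""
    return float(np.linalg.norm(sent - received))

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(32, 32)).astype(np.uint8)
noisy = np.clip(frame.astype(int) + rng.normal(0, 10, frame.shape),
                0, 255).astype(np.uint8)    # channel degradation stand-in

sent = extract_features(frame)   # goes over the low-rate auxiliary channel
recv = extract_features(noisy)   # recomputed at the receiver from the main channel
print(feature_distance(sent, sent))        # 0.0 for an undistorted video
print(feature_distance(sent, recv) > 0.0)  # True: degradation is detected
```

The design point is the one the paragraph makes: only a handful of representative numbers cross the auxiliary channel, so its bandwidth and rate requirements stay far below those of the main video channel.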



Abstract

The invention discloses a method for partial-reference evaluation of wireless videos based on space-time domain feature extraction, and relates to methods for evaluating video quality. Building on a thorough analysis of existing objective video-quality evaluation models, and incorporating the visual characteristics of the human eye, the invention improves the conventional SSIM model. The method takes the temporal fluency of the video, together with the structural similarity and definition (sharpness) of its spatial domain, as the main evaluation indexes. While preserving evaluation accuracy, it extracts characteristic parameters of the ST domains (the space domain and the time domain), establishes a new evaluation model, reduces the reference data needed for evaluation, and lowers the computational complexity, so that the evaluation model is suitable for real-time quality evaluation of wirelessly transmitted videos.
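The abstract combines a spatial, SSIM-style structural index with a temporal fluency index. As a rough sketch of how such indexes could be computed: the single-window "global" SSIM below is a simplification of the standard windowed algorithm, and the fluency formula is an invented illustration, not the patent's actual model:

```python
import numpy as np

C1, C2 = (0.01 * 255) ** 2, (0.03 * 255) ** 2  # standard SSIM stabilizing constants

def ssim_global(x: np.ndarray, y: np.ndarray) -> float:
    """Simplified whole-frame SSIM (the standard algorithm averages over
    local 11x11 windows; one global window is used here for brevity)."""
    x, y = x.astype(np.float64), y.astype(np.float64)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + C1) * (2 * cov + C2)) / \
           ((mx ** 2 + my ** 2 + C1) * (vx + vy + C2))

def temporal_fluency(frames: list) -> float:
    """Hypothetical fluency index in (0, 1]: 1 / (1 + mean inter-frame
    difference); frozen-then-jumping playback lowers it."""
    diffs = [np.abs(a.astype(float) - b.astype(float)).mean()
             for a, b in zip(frames, frames[1:])]
    return 1.0 / (1.0 + float(np.mean(diffs)))

rng = np.random.default_rng(1)
frames = [rng.integers(0, 256, size=(16, 16)).astype(np.uint8) for _ in range(4)]
print(round(ssim_global(frames[0], frames[0]), 6))  # 1.0 for identical frames
print(0.0 < temporal_fluency(frames) <= 1.0)        # True
```

A combined score would then weight the spatial and temporal indexes together; the abstract's point is that both can be computed from a small set of extracted ST-domain parameters rather than from the full reference video.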

Description

technical field

[0001] The present invention relates to a video quality evaluation method, and in particular to a partial-reference video evaluation method that is based on space-time domain feature extraction and improves the traditional SSIM algorithm. It requires only a small amount of reference data and is especially suitable for real-time quality evaluation in wireless video services.

Background technique

[0002] With the growth of mobile communication services, wireless communication has been widely adopted. The characteristics of the wireless communication environment (wireless channels, mobile terminals, etc.) and of mobile multimedia application services provide increasing support and optimization for image and video services. With the diversification of wireless networks and the growing diversity and complexity of wireless environments, signal loss occurs during video transmission, so the perceived quality of video images has become an important indicator t...


Application Information

IPC(8): H04N17/00, H04N17/02, H04N7/26, H04N19/186
Inventors: 黄联芬, 林佳楠, 陈少俊, 张远见, 姚彦, 冯超, 李进锦
Owner XIAMEN UNIV