
An Objective Evaluation Method of Stereo Image Quality Based on Sparse Representation of Structural Texture

A technology for the objective quality evaluation of stereoscopic images, applied in image enhancement, image analysis, image data processing, etc.; it addresses problems such as high computational complexity and unsuitability for practical applications.

Inactive Publication Date: 2017-09-19
江苏追梦信息科技有限公司

AI Technical Summary

Problems solved by technology

At present, existing methods build the evaluation model through machine learning, but their computational complexity is high, and training the model requires the subjective evaluation value of every evaluated image to be known in advance, which makes them unsuitable for practical applications and imposes certain limitations.




Embodiment Construction

[0061] The present invention will be described in further detail below in conjunction with the drawings and embodiments.

[0062] The present invention proposes an objective evaluation method for stereoscopic image quality based on sparse representation of structural texture. Its overall implementation block diagram is shown in Figure 1. The method comprises two phases, a training phase and a testing phase. The specific steps of the training phase are as follows:

[0063] ①-1. Select the left-viewpoint images of N original undistorted stereoscopic images to form the training image set, denoted as {L_i,org | 1 ≤ i ≤ N}, where N ≥ 1 and L_i,org denotes the i-th image in {L_i,org | 1 ≤ i ≤ N}, i.e. the left-viewpoint image of the i-th original undistorted stereoscopic image; the symbol "{}" denotes a set; the width of the original undistorted stereoscopic images is W and their height is H.
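Step ①-1 amounts to collecting the N left-viewpoint images into one set. The following is a minimal sketch of that collection step, assuming the left-viewpoint images are stored as grayscale PNG files in a single directory; the directory layout, file format, and use of Pillow are illustrative assumptions and not part of the patent.

```python
# Minimal sketch of step ①-1: gather the training image set {L_i,org | 1 <= i <= N}.
# The file layout, PNG format, and Pillow/NumPy usage are illustrative assumptions.
from pathlib import Path

import numpy as np
from PIL import Image

def build_training_set(left_view_dir):
    """Collect the left-viewpoint images of the N original undistorted stereo images."""
    paths = sorted(Path(left_view_dir).glob("*.png"))   # hypothetical naming/format
    images = [np.asarray(Image.open(p).convert("L"), dtype=np.float64) for p in paths]
    assert len(images) >= 1, "the method requires N >= 1"
    height, width = images[0].shape                      # H and W in the patent's notation
    assert all(img.shape == (height, width) for img in images), "all training images share W x H"
    return images
```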

[0064] In specific implementation, the numbe...



Abstract

The present invention discloses an objective assessment method for stereoscopic image quality based on sparse representation of structural texture. In the training phase, the left-viewpoint images of a number of original undistorted stereoscopic images are separated into structure and texture components; dictionary training is then performed in an unsupervised learning manner on the collection of sub-blocks of all structure images to obtain a structure dictionary, and on the collection of sub-blocks of all texture images to obtain a texture dictionary. In the testing phase, a structure sparse coefficient matrix and a texture sparse coefficient matrix are obtained by joint optimization against the structure and texture dictionaries, and the objective image quality evaluation value of the distorted stereoscopic image is computed from them. The resulting values maintain good consistency with subjective evaluation values; the structure and texture dictionaries do not need to be recomputed in the testing phase, which reduces computational complexity, and the subjective evaluation values do not need to be known in advance, so the method is suitable for practical applications.
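To make the abstract's pipeline concrete, below is a compact, hedged sketch in Python. It is not the patent's implementation: Gaussian smoothing stands in for the unspecified structure-texture separation, scikit-learn's MiniBatchDictionaryLearning with OMP coding stands in for the unsupervised dictionary training and the joint optimization, and a simple single-view comparison of coefficient matrices stands in for the patent's quality pooling; patch size, dictionary size, and sparsity level are illustrative.

```python
# Hedged sketch of the pipeline summarized in the abstract, not the patent's reference code.
# Assumptions (not from the excerpt): Gaussian smoothing as the structure-texture separation,
# scikit-learn dictionary learning + OMP as the unsupervised training and sparse coding,
# a single-view comparison of coefficient matrices as the quality pooling.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

PATCH = (8, 8)     # sub-block size (illustrative)
ATOMS = 128        # number of dictionary atoms (illustrative)
SPARSITY = 5       # non-zero coefficients per sub-block (illustrative)

def split_structure_texture(img, sigma=2.0):
    """Stand-in separation: structure = smoothed image, texture = residual."""
    structure = gaussian_filter(img, sigma)
    return structure, img - structure

def patches(img, max_patches=2000):
    """Sample sub-blocks and flatten them into rows of a sample matrix."""
    p = extract_patches_2d(img, PATCH, max_patches=max_patches, random_state=0)
    return p.reshape(len(p), -1)

def train_dictionaries(left_views):
    """Training phase: learn one structure dictionary and one texture dictionary
    from the sub-blocks of all training left-viewpoint images (done once, offline)."""
    s_rows, t_rows = [], []
    for img in left_views:
        s, t = split_structure_texture(img)
        s_rows.append(patches(s))
        t_rows.append(patches(t))
    opts = dict(n_components=ATOMS, transform_algorithm="omp",
                transform_n_nonzero_coefs=SPARSITY, random_state=0)
    d_s = MiniBatchDictionaryLearning(**opts).fit(np.vstack(s_rows))
    d_t = MiniBatchDictionaryLearning(**opts).fit(np.vstack(t_rows))
    return d_s, d_t   # fixed at test time; nothing is re-trained per evaluated image

def sparse_codes(img, model):
    """Sparse coefficient matrix of an image's sub-blocks under a fixed dictionary."""
    return model.transform(patches(img))

def quality_index(distorted, reference, d_s, d_t):
    """Testing phase (single view, simplified): compare the structure and texture
    sparse coefficient matrices of the distorted image with those of the reference;
    a smaller difference maps to a higher (better) quality value."""
    ds, dt = split_structure_texture(distorted)
    rs, rt = split_structure_texture(reference)
    diff = sum(np.linalg.norm(sparse_codes(a, m) - sparse_codes(b, m))
               for a, b, m in ((ds, rs, d_s), (dt, rt, d_t)))
    return 1.0 / (1.0 + diff)   # illustrative mapping to a bounded score
```

The key property the sketch preserves is the one the abstract emphasizes: the two dictionaries are learned once in the training phase and then stay fixed, so the testing phase only performs sparse coding, and no subjective scores are needed during training.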

Description

Technical field

[0001] The invention relates to an image quality evaluation method, and in particular to an objective evaluation method of stereoscopic image quality based on sparse representation of structural texture.

Background technique

[0002] With the rapid development of image coding technology and stereoscopic display technology, stereoscopic image technology has received increasing attention and application and has become a current research hotspot. Stereoscopic image technology exploits the binocular parallax principle of the human eye: each eye independently receives the left or right viewpoint image of the same scene, and the brain fuses the two to form binocular parallax, producing a stereoscopic percept with a sense of depth and realism. Owing to the influence of acquisition systems and of storage, compression and transmission equipment, stereoscopic images inevitably introduce a series of distortions. Compared with single-channel images, stereoscopic images...

Claims


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06T7/00
CPCG06T2207/20081G06T2207/30168
Inventor 邵枫李柯蒙李福翠
Owner 江苏追梦信息科技有限公司