
Method for calculating 3D-JND threshold value

A 3D-JND calculation method, applied in the field of image communication, electrical components, stereo systems, etc.; it addresses the problems of existing models not considering binocular stereo vision and not considering its impact.

Inactive Publication Date: 2015-07-01
TONGJI UNIV

AI Technical Summary

Problems solved by technology

However, this model only uses the depth information of the stereo image to establish a saliency model and does not consider the influence of two-dimensional features, such as color, on stereo saliency modeling. Moreover, the model does not take into account another important feature of binocular stereo vision, namely the binocular suppression effect, so its accuracy needs further improvement.




Embodiment Construction

[0041] The present invention will be further described below in conjunction with the embodiments shown in the drawings.

[0042] The example provided by the present invention uses MATLAB 7 as the simulation experiment platform and selects the view1 and view3 perspectives of "Art" (size: 695×555) from the Middlebury 3D image library as the binocular stereo test images (as shown in Figure 2). This example is described in detail below, step by step:

[0043] Step (1): Select the left and right color images as the test images, and use the NAMM model to obtain the 2D-JND basic thresholds of the left and right views. The basic threshold for each view can be calculated according to the following formula:

[0044] SJND_Y(x, y) = T_l(x, y) + T_Y^t(x, y) − C_Y^{lt} · min{ T_l(x, y), T_Y^t(x, y) }    (1)

[0045...
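Equation (1) above is the NAMM nonlinear additivity rule: the luminance-masking threshold T_l and the texture-masking threshold T_Y^t are summed, and their overlap is discounted by the coefficient C_Y^{lt}. A minimal Python sketch of just this combination step (the masking maps are taken as given toy inputs here, and C_Y^{lt} = 0.3 is an illustrative value, not one stated in this excerpt):

```python
def namm_combine(T_l, T_t, C_lt=0.3):
    """Eq. (1): SJND(x, y) = T_l + T_t - C_lt * min(T_l, T_t).

    T_l : luminance-masking threshold map (nested lists)
    T_t : texture-masking threshold map (same shape)
    C_lt: overlap-discount coefficient (illustrative value)
    """
    return [
        [tl + tt - C_lt * min(tl, tt) for tl, tt in zip(row_l, row_t)]
        for row_l, row_t in zip(T_l, T_t)
    ]

# Toy 2x2 masking maps (values are illustrative only)
T_l = [[4.0, 6.0], [8.0, 2.0]]
T_t = [[3.0, 9.0], [1.0, 5.0]]
print(namm_combine(T_l, T_t))
```

Note that each combined threshold exceeds the larger of the two masking thresholds but stays below their plain sum, reflecting that the two masking effects partially overlap rather than add fully.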


Abstract

The invention discloses a method for calculating a 3D-JND threshold value. The method performs 3D-JND modeling by jointly considering the multilevel stereoscopic selective attention mechanism of the human visual system when viewing a stereoscopic scene, operating from far to near and from coarse to fine, and the binocular suppression effect of binocular stereo vision. Because both the multilevel selective attention mechanism and the binocular suppression effect are taken into account, the method matches the human stereoscopic visual system better than traditional 3D-JND models. The proposed calculation method can tolerate more noise while maintaining better visual quality. The model can be applied to perceptual coding of 3D images/videos in order to remove visual redundancy in the image/video.
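The abstract describes modulating the JND threshold with a selective-attention mechanism: regions the eye attends to tolerate less distortion, unattended regions more. The patent's actual weighting function is not given in this excerpt; the sketch below is a generic, hypothetical attention modulation, where the linear form and the factor `alpha` are assumptions made purely for illustration:

```python
def modulate_jnd(jnd2d, attention, alpha=0.5):
    """Scale a 2D-JND map by a visual-attention weight in [0, 1].

    attention = 1: fully attended -> threshold unchanged (distortion
    is most visible there); attention = 0: unattended -> threshold
    raised by a factor (1 + alpha). The linear form is hypothetical.
    """
    return [
        [t * (1.0 + alpha * (1.0 - w)) for t, w in zip(row_t, row_w)]
        for row_t, row_w in zip(jnd2d, attention)
    ]

# Toy 2x2 JND map and attention map (illustrative values only)
jnd = [[6.1, 13.2], [8.7, 6.4]]
attn = [[1.0, 0.0], [0.5, 1.0]]
print(modulate_jnd(jnd, attn))
```

The design intuition is that a larger threshold in unattended regions lets a perceptual coder discard more information there without visible loss.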

Description

Technical field

[0001] The invention belongs to the technical field of stereo image/video coding and relates to stereo perceptual distortion modeling technology.

Background technique

[0002] Traditional image/video coding technology mainly aims at removing spatial and temporal statistical redundancy for compression coding and gives little consideration to the visual redundancy in the image/video. In order to further improve coding efficiency, researchers have begun to work on removing visual redundancy. At present, an effective way to characterize visual redundancy is the just noticeable difference (JND). The traditional two-dimensional JND model (2D-JND) mainly considers visual characteristics such as the luminance masking effect, the contrast masking effect, and the multi-channel decomposition mechanism, for example the NAMM model in Literature 1 (X. Yang, W. Lin, Z. Lu, E. P. Ong, and S. Yao, "Just-noticeable-distortion profile with nonlinear additivity model for perceptual masking color images...
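As the background notes, a JND profile characterizes visual redundancy: distortion whose magnitude stays below the per-pixel threshold is imperceptible by definition. A common way to exercise a JND model (a standard evaluation device in the JND literature, not a procedure described in this document) is to inject random-sign noise scaled by the JND map and check that the image still looks unchanged:

```python
import random

def inject_jnd_noise(image, jnd, strength=1.0, seed=0):
    """Add +/- strength * JND(x, y) noise to each pixel, clamped to [0, 255].

    If the JND model is accurate and strength <= 1, the result should be
    visually indistinguishable from the original despite the distortion.
    """
    rng = random.Random(seed)
    noisy = []
    for row_img, row_jnd in zip(image, jnd):
        noisy.append([
            max(0.0, min(255.0, px + rng.choice((-1, 1)) * strength * t))
            for px, t in zip(row_img, row_jnd)
        ])
    return noisy

# Toy 2x2 luminance image and JND map (illustrative values only)
img = [[120, 121], [119, 200]]
jnd = [[5.0, 3.0], [4.0, 6.0]]
print(inject_jnd_noise(img, jnd))
```

A better JND model supports a larger tolerable noise energy at the same visual quality, which is the sense in which the abstract claims the proposed model "can tolerate more noise".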

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): H04N13/00
Inventors: 张冬冬 (Zhang Dongdong), 陈勇 (Chen Yong)
Owner: TONGJI UNIV