A 3D-JND threshold calculation method

A 3D-JND threshold calculation technology, applied in image communication, electrical components, stereo systems, etc., which addresses the problems of existing models that neglect the influence of two-dimensional features (such as color) and of binocular stereo vision effects such as binocular suppression.

Publication Date: 2017-05-31 (patent inactive)
TONGJI UNIV

AI Technical Summary

Problems solved by technology

However, this model only uses the depth information of the stereo image to establish a saliency model and does not consider the influence of two-dimensional features, such as color, on stereo saliency modeling.
Moreover, the model does not take into account another important characteristic of binocular stereo vision, namely the binocular suppression effect, so its accuracy needs further improvement.

Method used


Examples


Embodiment Construction

[0041] The present invention will be further described below in conjunction with the embodiments shown in the accompanying drawings.

[0042] The example provided by the present invention adopts MATLAB 7 as the simulation experiment platform and selects the view1 and view3 viewpoints of the Art image (size: 695×555) in the Middlebury 3D image database as the binocular stereoscopic test images (as shown in figure 2). This example is described in detail below in conjunction with each step:
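As a concrete illustration of this setup, a minimal loading sketch is given below (Python is used here instead of the MATLAB 7 platform named above; the file paths are assumptions, since the extract identifies the views only as the Art view1 and view3 images):

```python
import imageio.v3 as iio

# Hypothetical file paths; the patent only identifies the stereo pair as the
# "Art" view1 and view3 images (695x555) from the Middlebury 3D image database.
left_view = iio.imread("Art/view1.png")    # left test image
right_view = iio.imread("Art/view3.png")   # right test image
assert left_view.shape[:2] == right_view.shape[:2]  # both views share one resolution
```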

[0043] In step (1), the left and right color images are selected as the test images, and the 2D-JND base thresholds of the left and right views are obtained with the NAMM model. The base threshold for each of the left and right views can be calculated according to the following formula:

[0044] T_2D(x, y) = T_l(x, y) + T_t(x, y) − C_{l,t} · min{ T_l(x, y), T_t(x, y) }

[0045] where T_l(x, y) represents the adaptive threshold of the image based on the background luminance at (x, y), that is, the maximum value of the background-luminance model and the spatial...
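For concreteness, a minimal NumPy sketch of this NAMM combination follows. The piecewise luminance-adaptation term, the gradient-based texture-masking term, and all constants (including the gain-reduction factor C_{l,t} = 0.3) are the commonly cited forms from the NAMM reference and are assumptions here, since the extract does not reproduce the patent's exact sub-models:

```python
import numpy as np
from scipy.ndimage import uniform_filter, convolve

def luminance_threshold(img):
    """Visibility threshold T_l from local background luminance (assumed piecewise form)."""
    bg = uniform_filter(img.astype(float), size=5)      # local mean as background luminance
    return np.where(bg <= 127,
                    17.0 * (1.0 - np.sqrt(bg / 127.0)) + 3.0,
                    3.0 / 128.0 * (bg - 127.0) + 3.0)

def texture_threshold(img):
    """Texture-masking threshold T_t from the maximal local gradient (assumed form)."""
    img = img.astype(float)
    # Simple Sobel-like gradients stand in for the model's directional operators.
    gx = convolve(img, np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]) / 8.0)
    gy = convolve(img, np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]]) / 8.0)
    return 0.117 * np.maximum(np.abs(gx), np.abs(gy))   # masking slope is a placeholder

def namm_2d_jnd(img, C_lt=0.3):
    """NAMM combination: T_l + T_t minus a gain-reduced overlap of the two maskings."""
    T_l = luminance_threshold(img)
    T_t = texture_threshold(img)
    return T_l + T_t - C_lt * np.minimum(T_l, T_t)
```

Applied to the luminance channels of the left and right views, namm_2d_jnd yields the two per-pixel base thresholds used in the subsequent steps.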



Abstract

The invention discloses a method for calculating the 3D-JND threshold. The method performs 3D-JND modeling by simultaneously considering the multilevel stereoscopic selective attention mechanism of the human visual system when viewing a stereoscopic scene, which proceeds from far to near and from coarse to fine, and the binocular suppression effect of binocular stereo vision. Because both the multilevel stereoscopic selective attention mechanism and the binocular suppression effect are taken into account in the 3D-JND modeling, the method conforms better to the human stereoscopic visual system than traditional 3D-JND models. The proposed calculation method can tolerate more noise while retaining better visual quality. The model can be applied to perceptual coding of 3D images/videos in order to remove visual redundancy in the image/video.
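The extract does not reproduce how these two factors are combined with the base threshold. Purely as an illustration of a common multiplicative-modulation structure (not the patent's actual formulation), the sketch below raises the 2D-JND base threshold in regions that receive less stereoscopic attention and stronger binocular suppression; both weighting functions and gains are placeholders:

```python
import numpy as np

def stereo_jnd(base_jnd, attention_map, suppression_map):
    """Illustrative 3D-JND modulation (placeholder form, not the patented formula).
    attention_map and suppression_map are assumed to be normalized to [0, 1]."""
    attention_weight = 1.0 + 0.5 * (1.0 - attention_map)    # less attended -> higher threshold
    suppression_weight = 1.0 + 0.5 * suppression_map        # more suppressed -> higher threshold
    return base_jnd * attention_weight * suppression_weight
```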

Description

Technical field

[0001] The invention belongs to the technical field of stereo image/video coding and relates to stereo perceptual distortion modeling technology.

Background technique

[0002] Traditional image/video coding techniques mainly focus on removing spatial and temporal statistical redundancy for compression coding, and give less consideration to the visual redundancy in images/videos. In order to further improve coding efficiency, researchers have started to study the removal of visual redundancy. At present, an effective way to represent visual redundancy is the just noticeable difference (JND) model. The traditional two-dimensional JND model (2D-JND) mainly considers visual characteristics such as the luminance masking effect, the contrast masking effect, and the multi-channel decomposition mechanism, for example the NAMM model in Document 1 (X. Yang, W. Lin, Z. Lu, E. P. Ong, and S. Yao, "Just-noticeable-distortion profile with nonlinear additiv...


Application Information

Patent Type & Authority: Patent (China)
IPC(8): H04N13/00
Inventors: 张冬冬, 陈勇
Owner: TONGJI UNIV