
Depth estimation method based on edge pixel features

A depth estimation technology, applied in the field of communication, that addresses the problems of reduced accuracy, degraded virtual-view quality, and large disparity errors at image edges, achieving fewer noise points and improved subjective and objective quality of the synthesized view.

Publication Date: 2010-09-22 (Inactive)
Applicant: XIDIAN UNIV

AI Technical Summary

Problems solved by technology

[0020] The existing depth estimation method is based on the assumption that the disparity of points along the same edge changes gently, while the disparity difference between pixels on an object edge and pixels off the edge is large. In practice, if an edge pixel and an adjacent pixel belong to the same object at the same depth, their disparity values should change smoothly; only when the two belong to objects at different depths can the disparity difference be large. Because the existing method indiscriminately assumes a large disparity difference between an edge pixel and its adjacent pixels, it produces erroneous disparity estimates at image edges, degrading the accuracy of the depth values there and reducing the quality of the synthesized virtual view.
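As a brief illustration of this distinction (assuming the usual rectified-stereo relation d = fB/Z between disparity d, focal length f, baseline B and depth Z, which this record does not state explicitly), the disparity gap between an edge pixel p and an adjacent pixel q behaves roughly as:

```latex
\[
|d(p)-d(q)| \approx
\begin{cases}
0, & p,\, q \text{ on the same object at depth } Z,\\[4pt]
fB\left|\dfrac{1}{Z_1}-\dfrac{1}{Z_2}\right|, & p \text{ at depth } Z_1,\ q \text{ at depth } Z_2 \text{ (different objects)},
\end{cases}
\]
```

so a large disparity jump is warranted only across a genuine depth boundary, not at every edge pixel.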




Embodiment Construction

[0057] Referring to Figure 3, the depth estimation method of the present invention comprises the following steps:

[0058] Step 1: classify the pixels in the image according to the position of the current pixel relative to the object edges.

[0059] 1A) If the current pixel is located on an object edge in the image, classify it as the first type of pixel, i.e., an on-edge pixel;

[0060] 1B) If the current pixel is not on an object edge but at least one of its surrounding adjacent pixels is located on an object edge, classify it as the second type of pixel, i.e., an edge-adjacent pixel;

[0061] 1C) If the current pixel is not on an object edge and none of its surrounding adjacent pixels is located on an object edge, classify it as the third type of pixel, i.e., a non-edge pixel.
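A minimal sketch of this three-way classification is given below. The edge detector (Canny), its thresholds, and the 8-connected neighborhood are assumptions made for illustration; the patent record does not specify which detector or neighborhood is used.

```python
import numpy as np
import cv2  # OpenCV, used here only for an assumed Canny edge detector

ON_EDGE, EDGE_ADJACENT, NON_EDGE = 0, 1, 2

def classify_pixels(gray, low_thresh=50, high_thresh=150):
    """Label each pixel as on-edge (1A), edge-adjacent (1B) or non-edge (1C).

    Illustrative only: detector, thresholds and neighborhood are assumptions.
    """
    edges = cv2.Canny(gray, low_thresh, high_thresh) > 0   # pixels lying on an object edge
    # A pixel is edge-adjacent if any of its 8-connected neighbors is an edge pixel.
    near = cv2.dilate(edges.astype(np.uint8), np.ones((3, 3), np.uint8)) > 0
    labels = np.full(gray.shape, NON_EDGE, dtype=np.uint8)
    labels[near & ~edges] = EDGE_ADJACENT   # next to an edge, but not on it (1B)
    labels[edges] = ON_EDGE                 # on the edge itself (1A)
    return labels
```

The resulting label map then selects which of the three disparity non-uniformity functions of Step 2 is applied at each pixel.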

[0062] Step 2, according to the depth features of the object edge pixels in th...
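The remainder of Step 2 is truncated in this record. According to the Abstract, each pixel class is given its own disparity non-uniformity (smoothness) function, which is combined with a luminance non-uniformity (data) term and minimized as an energy function. The sketch below shows one plausible shape of such a class-dependent energy; the functional forms, the truncation constant tau, and the weight lam are assumptions, not the patent's actual definitions.

```python
# Class labels as in the classification sketch above.
ON_EDGE, EDGE_ADJACENT, NON_EDGE = 0, 1, 2

def disparity_nonuniformity(d, d_neighbor, pixel_class, tau=2.0):
    """Class-dependent disparity non-uniformity penalty (assumed form).

    Edge-adjacent pixels may sit across a real depth boundary, so their
    penalty is truncated to allow a disparity jump; elsewhere (including
    along an edge) the disparity is encouraged to vary smoothly.
    """
    diff = abs(d - d_neighbor)
    if pixel_class == EDGE_ADJACENT:
        return min(diff, tau)
    return diff

def pixel_energy(luminance_cost, d, neighbor_disparities, pixel_class, lam=0.1):
    """Energy = luminance non-uniformity (data term) + lam * disparity
    non-uniformity (smoothness term); lam is an assumed weighting."""
    smooth = sum(disparity_nonuniformity(d, dn, pixel_class)
                 for dn in neighbor_disparities)
    return luminance_cost + lam * smooth
```

In the patent's scheme the disparity at each pixel is obtained by minimizing such an energy over candidate disparities; the exact minimization strategy is not given in this record.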



Abstract

The invention discloses a depth estimation method based on edge pixel features, mainly solving the problem of inaccurate depth estimation at image edges in the traditional depth estimation methods of an FTV system. In the scheme, the pixels in an image are first divided into three classes according to the position of the current pixel: on-edge pixels, edge-adjacent pixels, and non-edge pixels. Secondly, a disparity non-uniformity function corresponding to each pixel class is designed according to the depth features of the object edge pixels in the image. Then the luminance non-uniformity and the disparity non-uniformity of each pixel are calculated from the luminance non-uniformity function and the three class-specific disparity non-uniformity functions, and the corresponding disparity estimation is carried out by minimizing an energy function. Finally, the estimated disparity value is converted to the corresponding depth value by a disparity-to-depth conversion function, completing the depth estimation. The invention effectively improves the depth estimation accuracy of object edge pixels, and can effectively ensure the subjective and objective quality of the virtual view synthesized at the receiving terminal of the FTV system.
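The disparity-to-depth conversion function itself is not reproduced in this record. A common conversion for rectified cameras, shown purely as an illustrative assumption, is the pinhole relation Z = fB/d:

```python
def disparity_to_depth(d, focal_length_px, baseline):
    """Convert disparity (pixels) to depth (same units as the baseline).

    Assumed pinhole/rectified-stereo relation Z = f * B / d; this is a
    stand-in, not the patent's actual conversion function.
    """
    if d <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline / d

# Example: f = 1000 px, B = 0.1 m, d = 20 px  ->  Z = 5.0 m
depth = disparity_to_depth(20.0, focal_length_px=1000.0, baseline=0.1)
```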

Description

Technical field

[0001] The invention belongs to the field of communication and relates to depth estimation technology for three-dimensional stereoscopic video, specifically a depth estimation method capable of obtaining a high-precision depth map, so that the synthesized virtual view has fewer noise points at object edges and its subjective and objective quality are effectively improved. The method can be applied to a free-viewpoint television (FTV) system.

Background technique

[0002] In traditional TV systems, users can watch only a limited viewing angle of the three-dimensional world; the viewpoint and viewing angle are determined by the three-dimensional position and orientation of the camera, so the user cannot freely choose where to view from. A free-viewpoint TV (FTV) system allows users to view the real three-dimensional scene from different viewpoints, thus providing a new, more vivid a...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/00; H04N13/00
Inventors: 刘晓仙, 常义林, 冯妮娜, 李志斌
Owner: XIDIAN UNIV