
Multi-viewpoint attitude estimating and self-calibrating method for three-dimensional active vision sensor

A technology combining active vision and attitude estimation, applied in instrumentation, computing, and image analysis to improve calibration efficiency.

Active Publication Date: 2007-01-03
SHENZHEN ESUN DISPLAY
Cites: 2 · Cited by: 34

AI Technical Summary

Problems solved by technology

For 3D active vision sensors based on phase mapping, calibration is mostly a static, manual procedure, and existing camera self-calibration methods, such as Zhang's (IEEE Transactions on Robotics and Automation, Vol. 12, No. 1, 1996, pp. 103-113), cannot be directly applied to 3D active vision sensors.



Examples


Embodiment

[0072] The structure of the actually designed 3D vision sensor is shown in Figure 1. 101 is a digital projector and 103 is a camera. 102 is the exit pupil P of the projection lens of the digital projector 101, and 104 is the entrance pupil E of the imaging lens of the camera 103. The adjusting rod 105 is used to adjust the height and angle of the camera 103, and 106 is a computer.

[0073] Following the steps described above, images of the real object (see Figure 7) are collected. The attitude and position of each viewpoint are then solved while the sensor parameters are calibrated simultaneously.
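Estimating the relative pose between two viewpoints can be sketched with the standard essential-matrix decomposition. This is a simplified illustration, not the patent's actual method (which jointly optimizes viewpoint poses and sensor parameters under epipolar constraints); the function name and structure are my own assumptions.

```python
import numpy as np

def decompose_essential(E):
    """Split an essential matrix E into the four candidate (R, t) poses
    via SVD (standard decomposition; a cheirality check on triangulated
    points would select the physically valid candidate)."""
    U, _, Vt = np.linalg.svd(E)
    # Enforce proper rotations (det = +1) on both orthogonal factors.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0., -1., 0.],
                  [1.,  0., 0.],
                  [0.,  0., 1.]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]  # translation direction, up to sign and scale
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]
```

The translation is recoverable only up to scale from the essential matrix alone; in a structured-light sensor the projector-camera baseline fixes the metric scale.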

[0074] The calibration result is:

[0075] (1) Internal parameters of the camera: K_c = [ 3564.36  −5.99209  252.127 ; 0  … ]
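The intrinsic matrix K_c has the standard upper-triangular pinhole form: focal lengths in pixels, a skew term, and the principal point. The sketch below is my own illustration of how such a matrix projects a camera-frame point to pixel coordinates; only the first-row values come from the reported result, and the remaining entries are placeholders for the truncated data.

```python
import numpy as np

# First row follows the reported calibration result:
fx, s, cx = 3564.36, -5.99209, 252.127
# Placeholders (NOT from the source, which is truncated here):
fy, cy = 3564.0, 256.0
K_c = np.array([[fx,  s,  cx],
                [0.0, fy, cy],
                [0.0, 0.0, 1.0]])

def project(K, X):
    """Pinhole projection of a 3-D point X in the camera frame
    to 2-D pixel coordinates (perspective divide by depth)."""
    x = K @ X
    return x[:2] / x[2]
```

A point on the optical axis, e.g. `project(K_c, np.array([0., 0., 1.]))`, lands on the principal point.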



Abstract

A multi-viewpoint attitude estimation and self-calibration method for a three-dimensional active vision sensor, belonging to three-dimensional digital imaging and modeling technology. The sensor consists of a digital projector and a camera. A texture image of the object is collected; a set of orthogonal fringe patterns is projected onto the object, and the corresponding coded fringe images are collected; the two-dimensional coordinates of feature points are computed from the texture image, and phase values are computed from the coded fringe images. A phase-to-coordinate transformation establishes the correspondence of the object's feature points between the projection plane of the digital projector and the imaging plane of the camera. The viewpoint is then changed and the above steps repeated; using epipolar geometry constraints, an optimization equation is constructed to automatically estimate the multi-viewpoint positions and attitudes and to self-calibrate the three-dimensional active vision sensor. The method enables accurate, automatic, in-situ multi-viewpoint attitude estimation and self-calibration of the vision sensor.
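The phase-measurement step of the pipeline above can be sketched as follows. This is a minimal illustration of a standard 4-step phase-shifting algorithm and a phase-to-projector-coordinate mapping, not the patent's specific implementation; all names and the choice of a 4-step algorithm are my own assumptions.

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from four phase-shifted fringe images
    I_k = A + B*cos(phi - k*pi/2), k = 0..3 (4-step algorithm)."""
    i0, i1, i2, i3 = images
    return np.arctan2(i1 - i3, i0 - i2)

def phase_to_projector_x(phi_unwrapped, num_fringes, proj_width):
    """Map unwrapped phase (0 .. 2*pi*num_fringes across the field)
    to a horizontal coordinate on the projector plane in pixels."""
    return phi_unwrapped / (2.0 * np.pi * num_fringes) * proj_width
```

Projecting fringes in both orientations gives both projector-plane coordinates, which is what establishes the camera-projector correspondence used in the optimization.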

Description

Technical field

[0001] The invention relates to a multi-viewpoint attitude estimation and self-calibration method for a three-dimensional active vision sensor, belonging to three-dimensional digital imaging and modeling technology.

Background technique

[0002] 3D Digital Imaging and Modeling (3DIM) is an emerging interdisciplinary field that has been actively researched internationally in recent years. It is widely used in reverse engineering, cultural relics protection, medical diagnosis, industrial inspection, virtual reality, and many other areas. As one of the main means of obtaining 3D information, the 3D active vision sensor based on phase mapping has the advantages of fast speed, high resolution, non-contact operation, and full-field data acquisition, and has attracted extensive attention and research. It is an important task to solve the motion direction of the sensor and estimate the position and attitude of the sensor at different viewp...

Claims


Application Information

IPC(8): G01B11/00, G01B21/00, G06T7/00
Inventors: 彭翔 (Peng Xiang), 丁雅斌 (Ding Yabin), 田劲东 (Tian Jindong)
Owner SHENZHEN ESUN DISPLAY