Visual tracking and positioning method based on dense point cloud and composite view

A visual tracking and positioning method in the field of information technology that achieves fast initial positioning

Active Publication Date: 2020-02-28
BEIJING INSTITUTE OF TECHNOLOGY

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to overcome the defects of the prior art by solving the problem of associating lidar three-dimensional scanning data with image data, so that the data can be used for visual tracking and positioning, and to propose a visual tracking and positioning method based on dense point clouds and synthetic views.




Embodiment Construction

[0024] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings.

[0025] The design idea of the present invention is as follows: transform the three-dimensional point cloud obtained by the lidar through spatial back projection to generate a synthetic image under a known viewpoint, and match the synthetic image against the real-time image acquired by the camera to estimate the camera's real-time six-degree-of-freedom (6-DoF) pose. This method can be used for tracking, navigation, or auxiliary positioning in robotics, unmanned vehicles, unmanned aerial vehicles, virtual reality, and augmented reality.
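The back-projection step described above can be sketched as a pinhole projection of the colored lidar points into a virtual camera, with a z-buffer so that only the nearest point survives at each pixel. The patent does not publish code; the function below is a minimal illustrative sketch, and all names (`synthesize_view`, the argument layout) are assumptions.

```python
import numpy as np

def synthesize_view(points, colors, K, R, t, width, height):
    """Project a colored 3D point cloud into a synthetic image seen from a
    virtual camera with intrinsics K and pose (R, t) in the point cloud frame.
    A z-buffer keeps only the nearest point per pixel (illustrative sketch)."""
    # Transform world points into the camera frame.
    cam = points @ R.T + t               # (N, 3)
    in_front = cam[:, 2] > 1e-6          # drop points behind the camera
    cam, colors = cam[in_front], colors[in_front]

    # Pinhole projection to integer pixel coordinates.
    uv = cam @ K.T
    u = (uv[:, 0] / uv[:, 2]).astype(int)
    v = (uv[:, 1] / uv[:, 2]).astype(int)
    ok = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, z, colors = u[ok], v[ok], cam[ok, 2], colors[ok]

    image = np.zeros((height, width, 3), dtype=np.uint8)
    zbuf = np.full((height, width), np.inf)
    # Z-buffer test: a point paints a pixel only if it is the closest so far.
    for ui, vi, zi, ci in zip(u, v, z, colors):
        if zi < zbuf[vi, ui]:
            zbuf[vi, ui] = zi
            image[vi, ui] = ci
    return image
```

A practical implementation would additionally splat each point over several pixels (or hole-fill the result) so the synthetic view is dense enough to match against a real camera frame.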

[0026] The visual tracking and positioning method of the present invention, based on dense point clouds and s...



Abstract

The invention provides a visual tracking and positioning method based on dense point clouds and composite views. The method comprises the steps of: performing three-dimensional scanning of a real scene to obtain color key-frame images and corresponding depth images, performing image restoration, and computing an image code for each key-frame image; computing the image code of the current frame image acquired by the camera in real time, and selecting the composite image closest to the current frame as its reference frame image; obtaining stable sets of matching feature points on the two images, and processing them to obtain the six-degree-of-freedom pose of the current-frame camera relative to the three-dimensional scanning point cloud coordinate system; and checking the result with an optical flow algorithm, and, if the requirement is not met, updating the current frame to the next frame acquired by the camera and re-matching. The invention solves the problem of associating the three-dimensional point cloud obtained by lidar with heterogeneous visual images, and achieves fast initialization positioning for visual navigation.
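The reference-frame selection step in the abstract (encode every image, then pick the composite image closest to the current frame) can be sketched with a stand-in global image code. The patent does not specify its coding scheme; the block-averaged, L2-normalized thumbnail below is a hypothetical substitute, and `encode_image`/`select_reference` are invented names.

```python
import numpy as np

def encode_image(img, size=8):
    """Hypothetical global image code: grayscale, block-average down to
    size x size, then L2-normalize. Stands in for the patent's unspecified
    image coding step. Assumes image height and width are divisible by size."""
    gray = img.mean(axis=2) if img.ndim == 3 else img
    h, w = gray.shape
    code = gray.reshape(size, h // size, size, w // size).mean(axis=(1, 3)).ravel()
    return code / (np.linalg.norm(code) + 1e-12)

def select_reference(current, composites):
    """Return the index of the composite (synthetic) image whose code is
    closest, in Euclidean distance, to the current camera frame's code."""
    q = encode_image(current)
    dists = [np.linalg.norm(q - encode_image(c)) for c in composites]
    return int(np.argmin(dists))
```

After the reference frame is chosen, the matched feature points between the two images would feed a standard PnP-style solver to recover the 6-DoF pose described in the abstract; that solver is not shown here.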

Description

Technical field

[0001] The invention belongs to the field of information technology, and in particular relates to a visual tracking and positioning method based on dense point clouds and synthetic views.

Background technique

[0002] Visual real-time tracking and positioning of large-scale outdoor scenes has long been an important research direction in computer vision. Complex and changeable outdoor environmental factors, such as lighting, viewing angle, occlusion, weak texture, and objects that move over time, all strongly affect the accuracy and robustness of visual tracking and positioning algorithms. Outdoor combat scenes such as deserts, grasslands, and mountains cover large areas with sparse texture information, posing greater technical challenges for visual tracking and positioning algorithms. At present, the commonly used algorithm for visual tracking and positioning in large outdoor scenes is SLAM (real-time positioning and m...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/246; G06T7/269; G06T7/73; G01S17/02; G01S17/66; G01S17/89
CPC: G01S17/66; G01S17/89; G06T2207/10016; G06T2207/10024; G06T2207/10028; G06T7/246; G06T7/269; G06T7/73
Inventor: 陈靖, 缪远东
Owner: BEIJING INSTITUTE OF TECHNOLOGY