Augmented reality visualization method based on depth camera and application

A depth-camera-based augmented reality technology, applied to image enhancement, image data processing, 3D modeling, etc. It addresses problems such as computation so heavy that system speed and accuracy are hard to guarantee, high demands on real-time performance and robustness, and interference from erroneous feature points, achieving the effects of enhancing visualization, reducing the amount of computation, and reducing acquisition error.

Active Publication Date: 2021-01-22
HEBEI UNIV OF TECH
Cites: 9 · Cited by: 5

AI Technical Summary

Problems solved by technology

However, in some specific operation scenarios the environmental background is complex and the target object of the operation is moving. Accurately locating and tracking a moving target object places high demands on the real-time performance and robustness of the system, and some tracking registration methods cannot meet these needs.
For example, tracking registration methods based on natural feature points suffer interference from erroneous feature points when texture features are lacking, and their accuracy is not high; marker-based methods require markers to be placed in the real scene in advance, and registration fails at extreme viewing angles or when a marker is occluded; model-based methods must compute views from many different angles, and the huge amount of computation makes it difficult to guarantee the speed and accuracy of the system. Therefore, this application proposes a new positioning and tracking approach that can track moving targets and meet application requirements. The visualization method of this application can accurately display virtual information in the operation scene, so that information occluded by a moving object can be visualized, which is of great practical significance.


Embodiment Construction

[0028] The present invention will be further described below in conjunction with the embodiments and accompanying drawings, but it is not intended to limit the protection scope of the present application.

[0029] This embodiment takes the installation scene of an SE-type dry-hanging stone curtain wall as an example. Because the curtain wall installation site has a complex environment, an uncertain background, and a moving target object, the visualization effect of this method is especially prominent. A mobile chassis and a UR5 robotic arm are used to pick up and install the stone panel. During installation, workers cannot obtain enough visual information, such as the pendant on the back of the stone panel to be installed and the keel of the curtain wall, and can only rely on personal experience and position estimation to complete a series of construction tasks. Because installation is inaccurate, the installation time is longer and the co...
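As an illustrative aside rather than the patent's own implementation, the occlusion-feature visualization this scenario calls for (showing the pendant and keel hidden behind the stone panel) is often realized by per-pixel depth testing: the depth camera's depth map is compared with the rendered depth of the virtual model, and virtual pixels that lie behind measured real surfaces are drawn semi-transparently so occluded structure stays visible. A minimal NumPy sketch, with all array names, shapes, and the blending rule assumed for illustration:

```python
import numpy as np

def visualize_occluded_features(rgb, real_depth, virtual_rgb, virtual_depth,
                                occluded_alpha=0.35, visible_alpha=1.0):
    """Blend a rendered virtual layer over the camera image.

    rgb           : HxWx3 uint8 camera frame
    real_depth    : HxW float32 depth from the depth camera (metres, 0 = invalid)
    virtual_rgb   : HxWx3 uint8 rendering of the virtual model (e.g. pendant/keel)
    virtual_depth : HxW float32 depth of that rendering (np.inf where empty)

    Virtual pixels that lie behind a measured real surface are treated as
    occluded and blended semi-transparently ("x-ray" view); pixels in front
    of the real surface are drawn opaquely.
    """
    out = rgb.astype(np.float32)
    has_virtual = np.isfinite(virtual_depth)
    valid_real = real_depth > 0

    occluded = has_virtual & valid_real & (virtual_depth > real_depth)
    visible = has_virtual & ~occluded

    for mask, alpha in ((visible, visible_alpha), (occluded, occluded_alpha)):
        out[mask] = (1 - alpha) * out[mask] + alpha * virtual_rgb[mask].astype(np.float32)
    return out.astype(np.uint8)
```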



Abstract

The invention relates to an augmented reality visualization method based on a depth camera, and to its application. The specific steps of the method are: 1, reconstructing a background point cloud with the depth camera and obtaining a target object point cloud through three-dimensional modeling; 2, reconstructing an environment point cloud based on the depth camera; 3, extracting key feature points of the target object point cloud and of the filtered environment point cloud, performing point cloud registration, obtaining the pose of the target object in the depth camera coordinate system, and completing recognition and initial positioning of the target object; 4, performing three-dimensional tracking registration based on RGB-D feature points; 5, selecting a key frame according to the relative pose change of the target object between the current key frame and the next image frame, optimizing the target object pose, and updating the target object and the target area; 6, realizing visualization of occluded features. With this method the target object can be recognized well in a complex scene, and the real-time performance, robustness, and tracking accuracy of the system are improved.
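As a hedged illustration of step 3 only (not the patent's actual pipeline), initial positioning by point cloud registration is commonly implemented as a coarse alignment using FPFH feature matching with RANSAC, refined by ICP. The sketch below uses the Open3D library; the function names `preprocess`/`initial_pose` and all parameter values (voxel size, correspondence distances, iteration counts) are assumptions chosen for illustration.

```python
import open3d as o3d

def preprocess(pcd, voxel):
    """Downsample and compute normals plus FPFH descriptors (radii assumed)."""
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=2 * voxel, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=5 * voxel, max_nn=100))
    return down, fpfh

def initial_pose(model_pcd, scene_pcd, voxel=0.01):
    """Coarse RANSAC alignment of the modelled target to the scene, refined by ICP.

    Returns a 4x4 transform: the target object's pose in the depth-camera frame.
    """
    model, model_fpfh = preprocess(model_pcd, voxel)
    scene, scene_fpfh = preprocess(scene_pcd, voxel)

    coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
        model, scene, model_fpfh, scene_fpfh, True,
        max_correspondence_distance=voxel * 1.5,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(False),
        ransac_n=4,
        checkers=[
            o3d.pipelines.registration.CorrespondenceCheckerBasedOnEdgeLength(0.9),
            o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(voxel * 1.5)],
        criteria=o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

    fine = o3d.pipelines.registration.registration_icp(
        model, scene, voxel * 0.4, coarse.transformation,
        o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return fine.transformation
```

The same pose would then serve as the starting point for the frame-to-frame RGB-D feature tracking and key-frame pose optimization described in steps 4 and 5.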

Description

technical field

[0001] The invention relates to technical fields such as computer vision and image acquisition and processing, and specifically to a depth-camera-based augmented reality visualization method and its application.

Background technique

[0002] Augmented reality technology is a branch of virtual reality technology. It integrates computer-generated objects or information with the real environment, thereby enhancing the user's awareness of that environment.

[0003] Tracking registration technology, as the core of augmented reality technology, has developed considerably. However, in some specific operation scenarios the environmental background is complex and the target object of the operation is moving; accurately locating and tracking the moving target object places high demands on the real-time performance and robustness of the system, and some tracking registration methods have difficulty meeting these needs. For example, the tracking and registration method based on natural feature points will cause error featu...


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06T19/00G06T17/00G06T5/00G06T7/90G06T7/50
CPCG06T19/006G06T17/00G06T5/002G06T7/90G06T7/50G06T2207/10028G06T2207/20028G06T2207/10016G06T2207/30132
Inventor 刘今越孙晨昭刘子毅李铁军贾晓辉
Owner HEBEI UNIV OF TECH