
Augmented reality implementation method based on a physical-object detection and tracking algorithm

A tracking algorithm and augmented reality technology, applied in the field of computer vision, which addresses problems such as interference in object matching and the limitations of three-dimensional object detection and tracking.

Publication status: Inactive; publication date: 2017-06-13
成都弥知科技有限公司


Problems solved by technology

However, a three-dimensional object may not contain rich feature points, and in extreme cases, such as a 3D-printed model, it may contain no feature points at all, which interferes with matching of the object. Detection and tracking of such objects based on feature points is therefore limited.


Examples


Embodiment 1

[0036] A method for implementing an augmented reality technology based on a physical object detection and tracking algorithm, comprising the following steps:

[0037] 1) Generate the local marker file, including the template file and the 3D model pose matrix file corresponding to the template file;

[0038] 2) Initialize the augmented reality system;

[0039] 3) Acquire an image of the real scene, then extract the image edges and optimize them;

[0040] 4) Perform template matching on the image edges; if the matching succeeds, generate a decision window in that region;

[0041] 5) Import the pose matrix corresponding to the template, converge the template to the global optimal solution, and fine-tune the pose matrix;

[0042] 6) Render the augmented reality special effects according to the corrected pose matrix;

[0043] 7) Extract the corner points inside the decision window generated by the optimal-solution template, and select an appropriate tracking algorithm according to the number of corner points.
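The following is a minimal sketch of the tracking-algorithm selection in step 7), assuming OpenCV and Python. The Shi-Tomasi corner detector, the threshold value of 20 and the (x, y, w, h) window layout are illustrative assumptions rather than details disclosed by the patent.

import cv2
import numpy as np

CORNER_THRESHOLD = 20  # hypothetical value; the patent does not disclose a concrete number

def select_tracker(gray_frame: np.ndarray, decision_window: tuple) -> str:
    """Count corners inside the decision window and pick a tracking strategy."""
    x, y, w, h = decision_window
    roi = gray_frame[y:y + h, x:x + w]
    # Shi-Tomasi corner detection restricted to the decision window
    corners = cv2.goodFeaturesToTrack(roi, maxCorners=200, qualityLevel=0.01, minDistance=5)
    n_corners = 0 if corners is None else len(corners)
    # More corners than the threshold -> feature point tracking; otherwise edge tracking
    return "feature_point_tracking" if n_corners > CORNER_THRESHOLD else "edge_tracking"

In such a setup, the feature point branch would typically feed the detected corners into an optical-flow style tracker, while the edge branch would continue to track the optimized edge line segments produced in step 3).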

Embodiment 2

[0047] This embodiment refines each step on the basis of Embodiment 1.

[0048] Specifically, in step 2), the augmented reality system can be implemented on a device equipped with a camera, such as a mobile phone, a tablet computer, smart glasses or a helmet. Initialization of the augmented reality system mainly covers two aspects: 1. calibration and initialization of the camera used to capture real-scene images, where initialization specifically refers to reading the camera's intrinsic parameters, such as focal length and distortion, into memory; 2. the augmented reality system reads the pre-stored local data needed to implement the technology, including the target marker files and the 3D model information.
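A minimal sketch of this two-part initialization, assuming the intrinsic parameters come from an offline calibration and are stored, together with the marker data, as JSON files; the file names, the JSON layout and the helper name initialize_ar_system are hypothetical.

import json
import numpy as np

def initialize_ar_system(intrinsics_path: str, marker_path: str):
    """Read camera intrinsics and pre-stored local marker data into memory."""
    with open(intrinsics_path) as f:
        calib = json.load(f)
    camera_matrix = np.array(calib["camera_matrix"])  # 3x3 intrinsic matrix (focal length, principal point)
    dist_coeffs = np.array(calib["dist_coeffs"])      # lens distortion coefficients
    with open(marker_path) as f:
        markers = json.load(f)                        # template files and 3D model pose matrices
    return camera_matrix, dist_coeffs, markers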

[0049] Acquire an image of the real scene through a camera or similar device at a frame rate of 30 FPS, extract the edges of the image and optimize them into edge line segments.
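A minimal sketch of this acquisition and edge step, assuming OpenCV. Canny edge detection followed by a probabilistic Hough transform is used here as a stand-in for whatever edge extraction and line-segment optimization the patent actually applies; the thresholds are illustrative.

import cv2
import numpy as np

def extract_edge_segments(frame):
    """Extract edges from one frame and optimize them into line segments."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                  # edge map; thresholds are illustrative
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=50, minLineLength=20, maxLineGap=5)
    return edges, segments

cap = cv2.VideoCapture(0)                             # camera device; typically around 30 FPS
ok, frame = cap.read()
if ok:
    edges, segments = extract_edge_segments(frame)
cap.release()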

[0050] The augmented reality system reads in the template...
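As an illustration of the template matching described in step 4), the sketch below matches an edge template that has been read in against the edge image and derives a decision window from a successful match. The use of cv2.matchTemplate and the 0.8 score threshold are assumptions, not the patent's own matching criterion.

import cv2

def match_template_on_edges(edge_image, edge_template, score_threshold=0.8):
    """Match an edge template against the edge image; return a decision window or None."""
    result = cv2.matchTemplate(edge_image, edge_template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < score_threshold:
        return None                                   # matching failed: no decision window
    h, w = edge_template.shape[:2]
    x, y = max_loc
    return (x, y, w, h)                               # decision window in image coordinates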



Abstract

The invention discloses an augmented reality implementation method based on a physical-object detection and tracking algorithm. The method comprises the following steps: a local marker file is generated; the augmented reality system is initialized; an image of the real scene is acquired; the edges of the image are extracted and optimized; template matching is performed on the image edges, and if the matching succeeds a decision window is generated in that region; the pose matrix corresponding to the template is imported, fine-tuned and corrected; the augmented reality special effect is rendered according to the corrected pose matrix; corner points are extracted inside the decision window and an appropriate tracking algorithm is chosen according to their number: if the number of corner points is larger than the threshold value, the system enters the feature point tracking algorithm, and if it is smaller than the threshold value, the system enters the edge tracking algorithm; the augmented reality animation effects are then updated according to the tracking algorithm. By adopting different tracking algorithms according to the number of corner points, detection and tracking of three-dimensional physical objects is achieved even when rich feature points do not exist or when there are no feature points at all.
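To make the pose import and fine-tuning step more concrete, the sketch below refines an initial pose (taken from the stored pose matrix) against 2D-3D correspondences using OpenCV's iterative PnP solver with an extrinsic guess. Treating the refinement as a PnP problem, and the variable names, are assumptions; the patent does not specify this particular solver.

import cv2
import numpy as np

def refine_pose(object_points, image_points, camera_matrix, dist_coeffs, rvec_init, tvec_init):
    """Fine-tune an initial pose, using the stored pose matrix as the starting guess."""
    ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                  camera_matrix, dist_coeffs,
                                  rvec_init, tvec_init,
                                  useExtrinsicGuess=True,
                                  flags=cv2.SOLVEPNP_ITERATIVE)
    R, _ = cv2.Rodrigues(rvec)                        # rotation matrix from the rotation vector
    pose = np.eye(4)
    pose[:3, :3] = R
    pose[:3, 3] = tvec.ravel()                        # 4x4 pose matrix used for rendering
    return ok, pose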

Description

technical field [0001] The invention relates to the field of computer vision, and in particular to an implementation method for augmented reality based on an object detection and tracking algorithm. Background technique [0002] Augmented Reality (AR) is a technology that integrates information from the virtual world, such as visual effects, sound effects and spatial information, into real environmental information. Augmented reality technology not only displays information about the real environment but also displays the integrated virtual information, and the two kinds of information complement and superimpose each other, so that the user obtains richer perceptual information. Usually, an electronic device equipped with augmented reality technology can use its camera lens to capture images of the real environment and calculate the position and angle of the captured images in real ...

Claims


Application Information

IPC (8): G06K9/00
CPC: G06V20/20
Inventor: 施茂燊
Owner: 成都弥知科技有限公司