
Inertial sensor and visual sensor data fusion algorithm

An inertial sensor and visual sensor technology, applied in the field of positioning, which addresses the problems that existing schemes output three-dimensional space coordinates at a low frequency and with high delay and therefore cannot adapt to application scenarios, such as VR, that demand low three-dimensional positioning delay and a high output frequency; the effect is to improve the output frequency of positioning information and to reduce positioning delay.

Active Publication Date: 2018-06-01
北京轻威科技有限责任公司

AI Technical Summary

Problems solved by technology

[0007] The present invention provides an inertial sensor and visual sensor data fusion algorithm to solve the problem that, in existing three-dimensional space coordinate positioning schemes, the frequency at which three-dimensional space coordinates are output is low and the delay is too high, so that the delay and frequency requirements of VR and other demanding three-dimensional space positioning application scenarios cannot be met.

Detailed Description of the Embodiments

[0037] Hereinafter, the present invention will be described in detail.

[0038] Unless defined otherwise, all technical and scientific terms used in this application have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. All patents and publications mentioned in this application are hereby incorporated by reference.

[0039] According to the inertial sensor and visual sensor data fusion algorithm involved in an embodiment of the present invention, the positioning information of a specific point is determined from the first positioning data of the positioning point collected by the inertial sensor and the second positioning data of the positioning point collected by the visual sensor. In addition, the algorithm can also determine the positioning information of the specific point from the first positioning data alone when the visual sensor has not collected the second positioning data, so that the positioning information can be dete...
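The fallback logic of paragraph [0039] can be pictured with a minimal sketch. This is not the patented algorithm: the class name InertialVisualFusion, the constant-gain blend, and the simple acceleration integration are illustrative assumptions; the sketch only shows inertial propagation at every step, with an optional visual correction applied when a camera fix exists.

```python
import numpy as np

class InertialVisualFusion:
    """Toy illustration of the fallback behaviour described in [0039]:
    propagate the state from the inertial data (first positioning data)
    at every step, and correct it with the visual fix (second positioning
    data) only when the visual sensor has actually produced one."""

    def __init__(self, gain=0.8):
        self.position = np.zeros(3)   # estimated 3-D coordinates of the positioning point
        self.velocity = np.zeros(3)
        self.gain = gain              # blending weight applied to the visual correction

    def update(self, accel, dt, visual_position=None):
        # First positioning data (inertial): integrate acceleration to
        # propagate the position at the high IMU rate.
        self.velocity += accel * dt
        self.position += self.velocity * dt

        # Second positioning data (visual): when a camera fix is available,
        # blend it in to bound the drift of the inertial propagation.
        if visual_position is not None:
            self.position += self.gain * (visual_position - self.position)

        return self.position.copy()
```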

Abstract

An embodiment of the invention provides an inertial sensor and visual sensor data fusion algorithm. With the provided inertial sensor and visual sensor data fusion positioning method, positioning information is determined from first positioning data of a positioning point acquired by an inertial sensor and second positioning data of the positioning point acquired by a visual sensor, or the positioning information is determined from the first positioning data alone when the second positioning data have not been acquired by the visual sensor. As a result, the positioning information does not always have to be determined from the second positioning data acquired by the visual sensor when two-dimensional and/or three-dimensional space coordinates are computed, and positioning information can still be output when the frequency of the visual sensor is low or accurate positioning data cannot be determined from the acquired images. The output frequency of the positioning information is thereby improved and the positioning delay is reduced.
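The abstract's point that positioning output need not wait for the visual sensor can be illustrated with a hypothetical use of the InertialVisualFusion sketch above; the 200 Hz IMU rate, the 20 Hz camera rate, and the toy motion are all assumptions made for illustration.

```python
# Hypothetical rates: IMU samples at 200 Hz, a camera fix on every 10th step (20 Hz).
fusion = InertialVisualFusion(gain=0.8)
dt = 1.0 / 200.0
estimates = []
for step in range(200):                       # one second of data
    accel = np.array([0.0, 0.0, 0.1])         # placeholder accelerometer sample
    visual_fix = None
    if step % 10 == 0:                        # a camera frame is available this step
        t = step * dt
        visual_fix = np.array([0.0, 0.0, 0.5 * 0.1 * t ** 2])
    estimates.append(fusion.update(accel, dt, visual_fix))
# 200 position estimates were produced in that second even though only 20 visual
# fixes arrived; the remaining estimates come from inertial propagation alone.
```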

Description

Technical Field

[0001] The invention relates to the field of positioning technology, and in particular to an inertial sensor and visual sensor data fusion algorithm.

Background Technique

[0002] With the rapid development of science and technology, the demand for higher-dimensional interaction between humans and computer equipment continues to grow. For a long time, interaction between humans and computers took place mainly in two-dimensional space, for example through flat-panel monitors, mice and touchpads. Using two-dimensional space to describe three-dimensional objects causes inconvenience and certain limitations; in particular, with today's development of three-dimensional design and three-dimensional entertainment, users cannot view the three-dimensional space of a design through natural, physical interaction.

[0003] The emergence of virtual reality (VR) has elevated the interaction between humans and computers to three dimensions. People can use their own body movements, bo...

Application Information

IPC(8): G01C21/16, G01C21/20, G06K9/62
CPC: G01C21/165, G01C21/20, G06F18/25
Inventor: 黄永鑫, 路晗, 张莹
Owner: 北京轻威科技有限责任公司