
GPS-based binocular fusion positioning method and device

A binocular fusion positioning technology, applied in the field of computer vision, which solves the problems of increased positioning error, loss of GPS signal, and low update frequency, and achieves the effect of high positioning accuracy.

Active Publication Date: 2017-02-15
CHENGDU TOPPLUSVISION TECH CO LTD

AI Technical Summary

Problems solved by technology

However, with GPS an object can only be positioned passively, using the information provided by the GPS device it carries; GPS positioning accuracy is only 2.5 meters under ideal conditions, and the update frequency is low.
In addition, for indoor objects the GPS signal is lost due to occlusion, so indoor positioning cannot be achieved and the positioning error increases. As a result, the object does not know its exact location and accurate positioning cannot be achieved.


Image

[Two figures, both captioned "GPS-based binocular fusion positioning method and device"]


Embodiment Construction

[0038] All features disclosed in this specification, or steps in all methods or processes disclosed, may be combined in any manner, except for mutually exclusive features and/or steps.

[0039] Any feature disclosed in this specification, unless specifically stated, can be replaced by other alternative features that are equivalent or have similar purposes. That is, unless expressly stated otherwise, each feature is one example only of a series of equivalent or similar features.

[0040] As shown in figure 1, the method of the invention comprises the following three steps:

[0041] 1) Obtain a three-dimensional map of the space in which the object to be positioned is located, such as a room or a building. The three-dimensional map contains at least image features of the space environment, such as brightness feature values, texture feature values, geometric feature values, and other image-related feature values. The three-dimensional map should also contain the earth coordinates of each point in the space...
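The patent text does not specify how the three-dimensional map is stored. As a minimal sketch of what paragraph [0041] describes, assuming a simple point-based layout in Python, each map point carries earth coordinates plus the listed image feature values; the MapPoint name and its fields are hypothetical, not taken from the patent.

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class MapPoint:
        """One point of the three-dimensional map (hypothetical layout)."""
        lat: float            # earth coordinate: latitude in degrees
        lon: float            # earth coordinate: longitude in degrees
        alt: float            # earth coordinate: altitude in meters
        brightness: float     # brightness feature value
        texture: np.ndarray   # texture feature values, e.g. a local descriptor
        geometry: np.ndarray  # geometric feature values, e.g. a surface normal

    # The map of a room or a building is then a collection of such points;
    # step 2 filters it by earth coordinates, step 3 matches its features.
    three_d_map: list[MapPoint] = []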



Abstract

The invention discloses a GPS-based binocular fusion positioning method and device, relating to computer vision technology, in particular to a real-time object positioning method, and suitable for the self-positioning of various objects, such as an unmanned aerial vehicle or a robot. The method comprises: step 1, acquiring a three-dimensional map of the space in which the object to be positioned is located; step 2, acquiring the longitude and latitude of the object to be positioned, and obtaining from them a coarse positioning range for the object within the three-dimensional map; and step 3, using a binocular measurement vision system mounted on the object to capture images of the reference objects surrounding it, performing three-dimensional reconstruction of the reference objects from the captured images to obtain a three-dimensional graph of the reference objects, searching within the coarse positioning range for a region that matches this three-dimensional graph, and thereby determining the precise positioning region of the reference objects in the three-dimensional map.
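Read as an algorithm, the abstract describes a coarse-to-fine pipeline. The sketch below is one plausible reading in Python with OpenCV, not the patented implementation: the GPS fix selects a candidate subset of the map (step 2), stereo block matching turns the binocular image pair into a three-dimensional reconstruction (step 3a), and feature matching inside the coarse range picks out the precise region (step 3b). The function name, the map arrays, the reprojection matrix Q, and all thresholds are assumptions.

    import cv2
    import numpy as np

    def locate(left_gray, right_gray, gps_lat, gps_lon,
               map_latlonalt, map_descriptors, Q, coarse_radius_deg=5e-4):
        # Step 2: coarse positioning range from the GPS longitude/latitude.
        d = np.linalg.norm(map_latlonalt[:, :2] - [gps_lat, gps_lon], axis=1)
        near = d < coarse_radius_deg                 # mask over map points
        candidate_descriptors = map_descriptors[near]

        # Step 3a: three-dimensional reconstruction of the reference objects
        # from the rectified binocular pair (grayscale images expected).
        stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
        disparity = stereo.compute(left_gray, right_gray).astype(np.float32) / 16.0
        reference_3d = cv2.reprojectImageTo3D(disparity, Q)

        # Step 3b: match image features against the coarse range of the map.
        orb = cv2.ORB_create()
        keypoints, descriptors = orb.detectAndCompute(left_gray, None)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(descriptors, candidate_descriptors),
                         key=lambda m: m.distance)[:20]

        # The best-matching map points delimit the precise positioning region.
        near_idx = np.flatnonzero(near)
        return reference_3d, map_latlonalt[[near_idx[m.trainIdx] for m in matches]]

In practice the degree-space distance above would be replaced by a proper geodesic computation, and the matched region would feed a pose estimate for the object itself; both refinements are outside this sketch.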

Description

Technical Field

[0001] The invention relates to computer vision technology, in particular to a real-time object positioning method, and especially to a method for the precise positioning of objects in a large-scale environment.

Background Technique

[0002] Achieving visual positioning has long been a technical problem in making drones and robots intelligent. To solve the problem of autonomous drone flight, the drone must first determine its own position. However, commonly used visual positioning algorithms require too much computation to be applied in large-scale outdoor environments, and they carry a certain degree of error, so the precise outdoor positioning of drones and robots depends on GPS navigation. But with GPS an object can only be positioned passively, using the information provided by the GPS device it carries; GPS positioning accuracy is only 2.5 meters under ideal conditions, and the update frequency is low. In addition, for indoor...


Application Information

IPC(8): G06T7/33
Inventors: 徐一丹, 龙学军, 周剑, 何佳蓉
Owner: CHENGDU TOPPLUSVISION TECH CO LTD