
Binocular vision indoor positioning and mapping method and device

A binocular vision indoor positioning technology, applied in the field of positioning and navigation, which solves the problems of existing methods — poor relocalization ability, poor navigation performance, and high requirements on scene texture — and achieves good accuracy and robustness.

Active Publication Date: 2019-07-23
SOUTHEAST UNIV
Cites: 5 · Cited by: 57


Problems solved by technology

[0006] The technical problem to be solved by the present invention is to overcome the limitations of existing indoor navigation approaches, namely their high demands on scene texture, poor relocalization ability, and poor navigation performance. The invention provides a binocular vision indoor positioning and mapping method and device: by fusing vision with an inertial navigation unit, the method can perform positioning and map building in complex scenes.




Embodiment Construction

[0037] In order to make the purpose, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the present invention are described below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention.

[0038] As shown in Figure 1, the present invention designs a binocular vision indoor positioning and mapping method that performs multi-sensor fusion positioning and mapping, realizing positioning, mapping and autonomous navigation in complex scenes. The method includes the following steps:

[0039] Step 1: Collect left and right images in real time with the binocular vision sensor, and calculate the initial pose of the camera from the left and right images.
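Step 1 above can be sketched with a generic stereo pipeline. This is an illustrative outline under standard pinhole-stereo assumptions, not the patent's actual algorithm: depth from horizontal disparity, back-projection to 3D, and a Kabsch/Umeyama rigid alignment between matched 3D points of consecutive frames to get an initial pose. All parameter names (`fx`, `baseline`, principal point) are assumptions.

```python
import numpy as np

def stereo_depth(u_left, u_right, fx, baseline):
    """Depth from horizontal disparity in a rectified stereo pair: Z = fx * B / d."""
    disparity = u_left - u_right
    return fx * baseline / disparity

def backproject(u, v, Z, fx, fy, cx, cy):
    """Back-project a pixel with known depth into a 3D camera-frame point."""
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fy
    return np.array([X, Y, Z])

def rigid_pose(P_prev, P_curr):
    """Estimate R, t with P_curr ~= R @ P_prev + t from two matched 3D point
    sets (Kabsch/Umeyama), a common way to initialize a frame-to-frame pose."""
    mu_p, mu_c = P_prev.mean(axis=0), P_curr.mean(axis=0)
    H = (P_prev - mu_p).T @ (P_curr - mu_c)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = mu_c - R @ mu_p
    return R, t
```

In a full system the alignment would run inside a RANSAC loop over feature matches to reject outliers; the sketch shows only the geometric core.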

[0040] First, for the left and right images obtained by the binocular vision sensor, if the brightness of the image is too high or too low, i...
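The paragraph above is truncated in the source, but it points at brightness normalization before feature extraction. One common approach (illustrative only; the patent's exact adjustment is not visible in this excerpt) is adaptive gamma correction, choosing the exponent so the mean intensity moves toward a mid-gray target:

```python
import numpy as np

def auto_gamma(img, target_mean=0.5):
    """Adaptive gamma correction: pick gamma so the normalized image's mean
    intensity maps toward target_mean. Brightens images that are too dark
    and darkens images that are too bright."""
    x = img.astype(np.float64) / 255.0
    mean = x.mean()
    # Solve mean**gamma = target_mean  ->  gamma = log(target_mean) / log(mean)
    gamma = np.log(target_mean) / np.log(max(mean, 1e-6))
    return (np.clip(x ** gamma, 0.0, 1.0) * 255.0).astype(np.uint8)
```

A dark frame (mean intensity around 25/255) comes back with a mean near 128, while an already well-exposed frame is left essentially unchanged (gamma close to 1).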



Abstract

The invention discloses a binocular vision indoor positioning and mapping method and device. The method comprises the following steps: collecting left and right images in real time and calculating the initial pose of the camera; collecting angular velocity and acceleration information in real time and pre-integrating it to obtain the state of an inertial measurement unit; constructing a sliding window containing several image frames and nonlinearly optimizing the initial camera pose, with the visual error term between image frames and the error term of the inertial measurement unit's measurements as constraints, to obtain the optimized camera pose and inertial measurement values; constructing bag-of-words models for loop detection and correcting the optimized camera pose; and extracting features of the left and right images and converting them into words for matching against the bag of words of the offline map, solving an optimization to obtain the optimized camera pose if the match succeeds, and re-collecting the left and right images and re-matching if it fails. The method and device can realize positioning and mapping in an unknown environment as well as positioning within an already constructed scene, with good precision and robustness.
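The IMU pre-integration step mentioned in the abstract can be illustrated with a minimal sketch. This is a generic visual-inertial formulation (simple Euler integration with rotations composed via the SO(3) exponential map; bias and noise terms omitted), not the patent's exact derivation:

```python
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: map a rotation vector to a rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(gyro, accel, dt):
    """Accumulate the relative rotation dR, velocity dv and position dp
    between two image frames from raw IMU samples. These pre-integrated
    quantities are what the sliding-window optimizer constrains."""
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt ** 2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp
```

Because the pre-integrated terms depend only on the IMU samples between two frames, they can be computed once and reused each time the optimizer re-linearizes the frame states in the sliding window.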

Description

Technical Field

[0001] The invention relates to a binocular vision indoor positioning and mapping method and device, belonging to the technical field of positioning and navigation.

Background

[0002] At present, indoor robots are increasingly used in large shopping malls, warehouses, and homes, for example shopping guide robots, intelligent storage robots, and home sweeping robots. In these applications the robot needs to perform autonomous navigation, which first requires indoor positioning: the robot must know its current location in the scene and the location of its destination.

[0003] At present, there is no mature indoor high-precision positioning solution. The GPS (Global Positioning System) solution has large positioning errors indoors; the solution relying on pasting QR code labels in the scene to c...

Claims


Application Information

IPC(8): G01C21/20, G01C21/16
CPC: G01C21/206, G01C21/165, Y02T10/40
Inventors: 李冰, 卢泽, 张林, 王亚洲, 高猛, 刘勇, 董乾, 王刚, 赵霞
Owner SOUTHEAST UNIV