
Indoor robot positioning method by combining visual odometer and physical odometer

A method in the field of autonomous positioning of indoor mobile robots that combines visual odometry with a physical (wheel) odometer. It addresses the problem that accumulated odometry error superposes over time and cannot be eliminated, and achieves the effects of ensuring efficiency and real-time performance, satisfying accuracy requirements, and solving the problem of error accumulation.

Active Publication Date: 2017-11-17
QINGDAO KRUND ROBOT CO LTD
Cites: 5 · Cited by: 38

AI Technical Summary

Problems solved by technology

[0004] The existing technology — an improved Monte Carlo particle filter combined with physical-odometer positioning — can position a robot adequately in an indoor environment with a simple structure and a small area. However, the physical odometer computes pose from displacement increments between successive time steps and considers only local motion, so its error accumulates continuously until the drift becomes too large to eliminate; the positioning error is even greater when a wheel slips or tilts.
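The unbounded growth of wheel-odometry error described above can be illustrated with a toy 1-D simulation (a sketch, not the patent's model; the function names and noise parameters are illustrative). Each displacement increment carries independent zero-mean noise, so the integrated pose error performs a random walk whose magnitude grows with the number of steps and is never reduced by later measurements:

```python
import random

def integrate_odometry(true_steps, noise_std, seed=0):
    """Integrate noisy displacement increments; return |final position error|.

    Each increment carries independent zero-mean Gaussian noise, so the
    integrated pose error is a random walk that only grows on average.
    """
    rng = random.Random(seed)
    est = true = 0.0
    for step in true_steps:
        true += step
        est += step + rng.gauss(0.0, noise_std)
    return abs(est - true)

def rms_error(n_steps, noise_std=0.01, trials=100):
    """RMS final-position error over many trials of an n_steps trajectory."""
    total = 0.0
    for seed in range(trials):
        e = integrate_odometry([0.1] * n_steps, noise_std, seed)
        total += e * e
    return (total / trials) ** 0.5

# RMS drift scales like sqrt(n_steps): a 25x longer run drifts ~5x more.
rms_short = rms_error(100)
rms_long = rms_error(2500)
```

This is exactly the behavior the patent targets: without an external correction (here, loop closure from visual odometry), the error can only grow.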



Examples


Embodiment Construction

[0018] The present invention will be described in detail below with reference to the drawings and embodiments.

[0019] As shown in attached Figures 1 and 2, the present invention provides an indoor robot positioning method combining a visual odometer and a physical odometer, comprising the following steps:

[0020] Step 1. Use an ASUS Xtion depth camera to acquire color and depth images;

[0021] Step 2. Extract ORB features from two consecutive images, compute a descriptor for each ORB feature point, and estimate the camera pose change by feature matching between adjacent images: 1) combine the depth image to obtain the 3-D positions of valid feature points; 2) using the ORB features and the depth values of the feature points, eliminate incorrect point pairs with the RANSAC algorithm; 3) compute the rotation matrix R and translation matrix T between adjacent images to estimate the camera pose transformation;
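The core of Step 2's parts 2) and 3) — rejecting wrong matches with RANSAC and recovering R, T from the surviving 3-D point pairs — can be sketched in NumPy as below. This is a minimal illustration, not the patent's implementation: ORB extraction and descriptor matching themselves would typically use OpenCV (`cv2.ORB_create`, `cv2.BFMatcher`), and the function names, thresholds, and iteration counts here are assumptions. The rigid fit uses the Kabsch/SVD algorithm, a standard way to solve Q ≈ R·P + T in least squares:

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares R, T with Q ~= R @ P + T for 3xN matched points
    (Kabsch algorithm: center both sets, SVD of the cross-covariance)."""
    cp, cq = P.mean(axis=1, keepdims=True), Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T
    U, _, Vt = np.linalg.svd(H)
    # Sign correction prevents a reflection when det would be -1.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    T = cq - R @ cp
    return R, T

def ransac_pose(P, Q, iters=100, thresh=0.05, seed=0):
    """RANSAC over matched 3-D point pairs: sample 3 pairs, fit a rigid
    transform, count inliers, then refit on the best inlier set."""
    rng = np.random.default_rng(seed)
    n = P.shape[1]
    best_inliers = np.zeros(n, dtype=bool)
    for _ in range(iters):
        idx = rng.choice(n, size=3, replace=False)
        R, T = rigid_transform(P[:, idx], Q[:, idx])
        err = np.linalg.norm(R @ P + T - Q, axis=0)
        inliers = err < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on all inliers of the best model for the final estimate.
    R, T = rigid_transform(P[:, best_inliers], Q[:, best_inliers])
    return R, T, best_inliers
```

In practice the 3-D point pairs come from back-projecting matched ORB keypoints through the depth image, and the inlier threshold is set relative to the depth sensor's noise.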

[0022] Step 3. During the robot's movement, select from among the adjacent frames the image that shares the most feature points and matches best as the key frame, and s...
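One simple way to realize Step 3's keyframe rule is sketched below (an assumption, not the patent's exact criterion): track how many ORB matches each frame shares with the current keyframe, and when tracking degrades, promote the best-matching recent frame to be the new keyframe:

```python
def select_keyframes(match_counts, min_matches=30):
    """Sketch of keyframe selection.

    match_counts[i] = number of ORB feature matches between frame i and
    the current keyframe (precomputed here for simplicity).  While the
    count stays above min_matches we remember the best-matching frame;
    when it drops below, that frame becomes the new keyframe.
    Returns the list of keyframe indices (frame 0 is always a keyframe).
    """
    keyframes = [0]
    best_idx, best_cnt = 0, 0
    for i, cnt in enumerate(match_counts[1:], start=1):
        if cnt >= min_matches:
            # Still tracking well; remember the best-matching candidate.
            if cnt > best_cnt:
                best_idx, best_cnt = i, cnt
        else:
            # Tracking degraded: promote the best recent frame, restart.
            keyframes.append(best_idx if best_cnt else i)
            best_idx, best_cnt = i, 0
    return keyframes

frames = select_keyframes([100, 80, 90, 40, 10, 95, 60, 20])
```

A real system would recompute the match counts against each newly promoted keyframe; the one-pass list here keeps the sketch short.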



Abstract

The invention discloses an indoor robot positioning method combining a visual odometer and a physical odometer. Images are collected and ORB features are extracted for image matching, camera pose estimation, and closed-loop detection, so that the robot is positioned accurately. The visual odometer is added in a known environment to perform closed-loop detection for the robot; the accumulated error of the particle-filter-based physical odometer is eliminated, and the global error of the odometer is changed to segmented accumulation, with a closed loop established on the basis of the above process. Combining the visual odometer effectively solves the error-accumulation problem of the physical odometer and realizes automatic positioning and accurate re-positioning of the robot in a known environment; the added computational burden is small, efficiency and real-time performance are guaranteed, and indoor navigation accuracy requirements can be satisfied, effectively solving the prior-art problem of inaccurate robot positioning in large environments.
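The abstract's key claim — that loop closure turns globally accumulating error into segmented accumulation — can be illustrated with a toy worst-case drift model (a sketch; `max_drift` and its parameters are illustrative, not from the patent). Drift grows between loop closures but is reset each time the visual odometer recognizes a previously seen place, so the worst-case error is bounded by the drift within one segment:

```python
def max_drift(n_steps, step_err, closure_every=None):
    """Worst-case accumulated odometry error over a trajectory.

    Without loop closure the drift grows linearly with the step count.
    With a closure every `closure_every` steps the pose is corrected,
    so the worst-case error is bounded by one segment's drift.
    """
    drift = worst = 0.0
    for t in range(1, n_steps + 1):
        drift += step_err
        worst = max(worst, drift)
        if closure_every and t % closure_every == 0:
            drift = 0.0  # loop closure corrects the accumulated pose error
    return worst

unbounded = max_drift(1000, 0.01)                    # grows with the run
bounded = max_drift(1000, 0.01, closure_every=100)   # capped per segment
```

With these numbers the uncorrected run drifts by 10.0 units while the loop-closed run never exceeds 1.0 — the "segmented accumulation" the abstract describes.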

Description

Technical field

[0001] The invention relates to the autonomous positioning accuracy of indoor mobile robots, and in particular to an indoor robot positioning method that integrates a visual odometer and a physical odometer.

Background technique

[0002] In research on intelligent navigation for autonomous mobile robots, simultaneous localization and mapping (SLAM) in unknown environments is a key technology with both engineering and academic value, and it is a research hotspot in the field. Under this trend, scholars have proposed a variety of methods to solve the SLAM problem and have applied a variety of sensors to the environmental-perception problem in SLAM.

[0003] The first problem SLAM technology must solve is selecting an appropriate sensor system to achieve real-time positioning of the robot. In practical applications, lidar-based sensors, which have high ranging and azimuth accuracy, are ...

Claims


Application Information

IPC(8): G01C21/20
CPC: G01C21/206
Inventors: 周唐恺, 江济良, 王运志
Owner QINGDAO KRUND ROBOT CO LTD