
Vision-based integrated navigation robot and navigation method

A technology combining a navigation robot with a navigation method, applied in the fields of navigation, mapping, and two-dimensional position/course control. It addresses the accumulated-error problem of dead-reckoning navigation methods, reduced reliability, and light interference in visual navigation.

Active Publication Date: 2016-03-09
HUBEI SANJIANG AEROSPACE HONGFENG CONTROL

AI Technical Summary

Problems solved by technology

[0002] There are many methods for mobile-robot autonomous navigation, chiefly odometry, visual navigation, gyroscope or strapdown inertial navigation, ultrasonic-sensor navigation, and laser-ranging-radar navigation. Although each sensor can support navigation on its own, any single sensor has shortcomings: odometer- and gyroscope-based navigation suffers from cumulative error, and visual navigation is subject to light interference that reduces reliability. Laser-ranging radar and ultrasonic sensors can also be used for navigation, but they require more reference objects in the environment.
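The cumulative-error problem described above can be illustrated with a short dead-reckoning simulation. The sketch below is not from the patent; the step length and bias value are arbitrary illustrative numbers. It integrates odometry with a small uncorrected heading bias and shows that position error grows with distance travelled, which is exactly why an external (e.g. visual) reference is needed:

```python
import math

def dead_reckon_error(steps, step_len=0.1, heading_bias=0.001):
    """Integrate odometry steps with a small constant heading bias
    (rad per step) and return the position error versus ground truth.

    Ground truth is a straight line along +x; the bias models an
    uncorrected gyroscope/encoder drift. All numbers are illustrative.
    """
    x = y = theta = 0.0
    for _ in range(steps):
        theta += heading_bias            # drift accumulates every step
        x += step_len * math.cos(theta)
        y += step_len * math.sin(theta)
    true_x = steps * step_len            # where the robot should be
    return math.hypot(x - true_x, y)
```

Running this for 100 steps versus 1000 steps shows the error growing far faster than linearly with distance, since the heading error itself keeps growing.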

Method used

the structure of the environmentally friendly knitted fabric provided by the present invention; figure 2 Flow chart of the yarn wrapping machine for environmentally friendly knitted fabrics and storage devices; image 3 Is the parameter map of the yarn covering machine
View more



Embodiment Construction

[0028] As shown in Figure 1, the robot of the present invention is a self-navigating robot adaptable to various environments. It adopts a four-wheel-drive cart structure and mainly comprises a car body, a color digital camera 1, ultrasonic sensors 2, wheels 3, motor-reducer assemblies 4, photoelectric encoders 5, a gyroscope 6, a wireless network card 7, a notebook computer (serving as the console) 8, as well as a motion controller, an industrial computer, an image-acquisition card, a data-acquisition-and-processing board, a lithium battery, and so on.

[0029] The robot is a four-wheel-drive cart. Each wheel 3 is fitted with a motor-reducer assembly 4; the four motors output torque simultaneously, giving the robot ample power and improved motion performance. The robot turns by differential steering. The color digital camera 1 is mounted at the front of the top of the car body and collects video image information through the colo...
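The differential-steering scheme mentioned above maps a commanded body velocity to individual wheel speeds. The following is a minimal sketch of standard differential-drive kinematics, not code from the patent; `track_width` and `wheel_radius` are hypothetical values chosen for illustration:

```python
def diff_drive_wheel_speeds(v, omega, track_width=0.4, wheel_radius=0.08):
    """Convert a body command (forward speed v in m/s, yaw rate omega in
    rad/s) into left/right wheel angular speeds in rad/s for a
    differential-drive base. Dimensions are illustrative assumptions.
    """
    v_left = v - omega * track_width / 2.0   # inner wheel slows in a turn
    v_right = v + omega * track_width / 2.0  # outer wheel speeds up
    return v_left / wheel_radius, v_right / wheel_radius
```

Equal wheel speeds produce straight-line motion; equal and opposite speeds spin the robot in place, which is how a four-wheel cart with no steered axle can turn.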



Abstract

The invention relates to a vision-based integrated navigation robot and navigation method. The robot comprises a four-wheel-drive cart with a color digital camera mounted on the body; several ultrasonic sensors at the front and rear of the body for detecting the distance to surrounding obstacles; a gyroscope inside the body for detecting the robot's attitude; four photoelectric encoders, one on each of the four servo drive motors, used as odometers; a motor controller; and an on-board control-system computer that ensures real-time image processing and control. The navigation method works primarily from visual navigation, combining it with information from the odometers, gyroscope, and ultrasonic sensors into an integrated navigation solution, so as to maximize the reliability and navigation precision of the system.
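One common way to realize the kind of fusion the abstract describes is a complementary-filter-style blend: trust the gyro's smooth short-term heading, but pull it toward the drift-free visual fix when one is available. This is a generic sketch of that idea, not the patent's actual algorithm, and the weight `alpha` is a hypothetical tuning value:

```python
def fuse_heading(gyro_heading, vision_heading, vision_valid, alpha=0.98):
    """Blend an integrated gyro heading with an absolute vision-derived
    heading (both in radians). When the camera cannot provide a fix
    (e.g. poor lighting), fall back to the gyro alone; otherwise nudge
    the gyro estimate toward the vision measurement.

    A complementary-filter sketch under assumed conventions, not the
    patent's published method.
    """
    if not vision_valid:
        return gyro_heading
    return alpha * gyro_heading + (1.0 - alpha) * vision_heading
```

Because the vision correction is applied repeatedly, even a small per-update weight is enough to bound the gyro's otherwise unbounded drift.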

Description

Technical Field

[0001] The invention relates to mobile technology for robot autonomous navigation.

Background

[0002] There are many methods for mobile-robot autonomous navigation, chiefly odometry, visual navigation, gyroscope or strapdown inertial navigation, ultrasonic-sensor navigation, and laser-ranging-radar navigation. Although each sensor can support navigation on its own, any single sensor has shortcomings: odometer- and gyroscope-based navigation suffers from cumulative error, and visual navigation is subject to light interference that reduces reliability. Laser-ranging radar and ultrasonic sensors can also be used for navigation, but they require more reference objects.

[0003] In view of the above problems, the present invention adopts an integrated navigation method that focuses on visual navigation while simultaneously fusing sensors such as photoelectric encoders, ultrasonic waves, and g...
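When two independent position estimates are available, e.g. one from odometry and one from vision, the textbook way to combine them is a variance-weighted average, which gives the minimum-variance fused estimate. This generic sketch illustrates the principle behind such sensor integration; it is not taken from the patent, and the variance values would in practice come from sensor characterization:

```python
def fuse_position(odo_xy, vis_xy, odo_var, vis_var):
    """Variance-weighted fusion of two independent (x, y) estimates.

    The estimate with the smaller variance receives the larger weight;
    with equal variances the result is the simple midpoint. A standard
    minimum-variance combination, shown here as an illustration only.
    """
    w_odo = vis_var / (odo_var + vis_var)   # weight on the odometry fix
    w_vis = 1.0 - w_odo                     # weight on the vision fix
    return (w_odo * odo_xy[0] + w_vis * vis_xy[0],
            w_odo * odo_xy[1] + w_vis * vis_xy[1])
```

As odometry variance grows with distance travelled (per the drift problem in [0002]), the fused estimate automatically leans more heavily on the vision fix.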

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G05D1/02; G01C21/00
Inventors: 魏焕兵, 杨玉枝, 王勇刚, 姚晓峰
Owner HUBEI SANJIANG AEROSPACE HONGFENG CONTROL