
Mobile robot visual navigation method based on deep learning

A deep learning technology for mobile robots, applied in the field of visual navigation of mobile robots, which solves the problems that lidar is expensive and cannot recognize objects, and achieves the effect of low cost.

Status: Inactive | Publication Date: 2019-02-15
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

Lidar technology is relatively mature and offers high precision. However, lidar sensors are often expensive and have trouble with mirrored (specular) surfaces; they can only measure the robot's own position and orientation relative to physical surfaces, and cannot identify indoor objects.




Embodiment Construction

[0034] Embodiments of the present invention are described in detail below. The embodiments are implemented on the premise of the technical solution of the present invention, and a detailed implementation and a specific operating process are given, but the protection scope of the present invention is not limited to the following embodiments.

[0035] Step 1: The robot explores the unknown environment at random, captures images at a given frequency, and names and saves the images according to the order in which they were taken.
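
The patent publishes no code for this step; the following is a minimal Python sketch of what a fixed-frequency, timestamp-named capture loop could look like. The camera index, the 2 Hz rate, and the file naming scheme are illustrative assumptions, not details from the patent.

```python
import time
import cv2

CAPTURE_HZ = 2.0  # assumed capture frequency; the patent only says "a given frequency"

cap = cv2.VideoCapture(0)  # assumed camera index
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Name each image by its capture time in milliseconds so that
        # sorting the filenames reproduces the shooting sequence.
        cv2.imwrite(f"explore_{int(time.time() * 1000)}.png", frame)
        time.sleep(1.0 / CAPTURE_HZ)
finally:
    cap.release()
```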

[0036] In this embodiment, the environment is the interior of a building. Because the environment is unknown to the robot, the robot must explore the space at random in order to recognize it, recording spatial information about the environment as it explores.
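
As an illustration only, a random exploration policy can be as simple as issuing random velocity commands at fixed intervals. The `robot.send_velocity(v, w)` driver and the speed ranges below are hypothetical; the patent does not describe the robot's motion interface.

```python
import random
import time

def random_explore(robot, step_time=1.0):
    """Drive the robot with randomly sampled velocity commands.

    `robot` is a hypothetical driver exposing send_velocity(v, w),
    with v in m/s and w in rad/s; all limits here are assumptions.
    """
    while True:
        v = random.uniform(0.0, 0.3)    # forward speed
        w = random.uniform(-0.5, 0.5)   # turning rate
        robot.send_velocity(v, w)
        time.sleep(step_time)
```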

[0037] The RGB-D depth camera collects images of the environment space from various angles at a high frequency. The image collection in this step saves only the color...
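
A sketch of grabbing a frame pair from an RGB-D stream and saving only the color image, assuming an Intel RealSense camera driven through pyrealsense2 (the patent names neither the camera model nor an SDK):

```python
import numpy as np
import cv2
import pyrealsense2 as rs  # assumption: an Intel RealSense RGB-D camera

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    color_frame = frames.get_color_frame()
    # Per paragraph [0037], only the color image is saved during exploration;
    # the depth stream is running but discarded at this stage.
    color = np.asanyarray(color_frame.get_data())
    cv2.imwrite("frame_color.png", color)
finally:
    pipeline.stop()
```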



Abstract

The invention discloses a deep-learning-based visual navigation method for mobile robots. A mobile robot acquires color images of an unknown environment space through its built-in depth camera; the collected images are processed to a preset size and annotated with a labeling tool (such as LabelImg), and the color images are then used to train a target detection model with a deep learning method. During navigation, the robot explores the unknown environment and collects images in real time (both color images and depth images). Each color image is fed to the target detection model to detect the target; once the target is detected, the direction and distance from the target position to the robot are computed from the depth map, and a motion strategy is generated accordingly.
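
The abstract does not state how direction and distance are derived from the depth map. Under a standard pinhole-camera model, one common approach is to take the horizontal offset of the bounding-box center from the principal point for the bearing, and a robust depth statistic inside the box for the range. The function below is a sketch of that approach; the intrinsics values and the median heuristic are assumptions, not details from the patent.

```python
import numpy as np

def target_bearing_and_distance(depth_m, bbox, fx, cx):
    """Estimate a detected target's bearing and range from an aligned depth map.

    depth_m : HxW depth image in meters, aligned with the color image
    bbox    : (x_min, y_min, x_max, y_max) from the target detector
    fx, cx  : horizontal focal length and principal point (pixels)
    """
    x_min, y_min, x_max, y_max = bbox
    u = (x_min + x_max) / 2.0                      # horizontal center of the box

    patch = depth_m[int(y_min):int(y_max), int(x_min):int(x_max)]
    distance = float(np.median(patch[patch > 0]))  # ignore missing-depth pixels

    bearing = float(np.arctan2(u - cx, fx))        # radians; positive = to the right
    return bearing, distance

# Example with a synthetic depth map (assumed intrinsics fx=615, cx=320):
depth = np.full((480, 640), 2.0)
bearing, distance = target_bearing_and_distance(depth, (120, 80, 260, 300), fx=615.0, cx=320.0)
print(f"bearing {np.degrees(bearing):.1f} deg, distance {distance:.2f} m")
```

A motion strategy can then be as simple as rotating by the computed bearing and driving forward by the estimated distance, stopping short of the target.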

Description

Technical field

[0001] The invention relates to the technical fields of target detection, deep learning, and indoor navigation of mobile robots, and in particular to a deep-learning-based visual navigation method for mobile robots.

Background technique

[0002] In recent years, research on intelligent robots has attracted widespread attention in both academia and industry.

[0003] At present, the most mature and widely used positioning technology is Global Positioning System (GPS) positioning. With this technology, mobile devices such as vehicles and mobile phones locate themselves by carrying GPS modules, thereby realizing navigation. GPS positioning requires the mobile device to be able to receive GPS signals, which suits open outdoor scenes; inside buildings, however, GPS signals are often weak and cannot provide precise location information for a robot, so the navigation effect is no...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G01C21/20; G06T7/73; G06T7/55
CPC: G01C21/206; G06T2207/10024; G06T2207/10028; G06T2207/20081; G06T2207/20084; G06T7/55; G06T7/73
Inventors: 阮晓钢, 任顶奇, 朱晓庆, 刘少达, 李昂, 武悦
Owner: BEIJING UNIV OF TECH