
Indoor mobile robot navigation method

A mobile robot navigation technology in the field of indoor mobile robot navigation. It addresses the problem that existing approaches are time-consuming and labor-intensive, and achieves the effect of reducing the amount of computation.

Pending Publication Date: 2022-02-08
杭州景吾智能科技有限公司 (Hangzhou Jingwu Intelligent Technology Co., Ltd.)

AI Technical Summary

Problems solved by technology

However, this patent document still has the defect that it needs to be deployed in advance, which is time-consuming and labor-intensive.


Image

  • Indoor mobile robot navigation method
  • Indoor mobile robot navigation method
  • Indoor mobile robot navigation method

Examples


Embodiment 1

[0044] This embodiment provides a navigation method for an indoor mobile robot, including the following steps:

[0045] Step 1: The robot body autonomously explores and photographs with its RGBD camera. Starting from a set point, it enters the room, and the navigation controller rotates the robot body slowly through 180 degrees from right to left. During the rotation, the RGBD camera captures image data once every set angle, recording the color image and depth image of the current position together with the robot body's current heading at the moment of capture. The set angle must not exceed the horizontal field of view of the RGBD camera.
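The capture schedule in Step 1 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the parameter names, the 60-degree step, and the 69-degree horizontal field of view (typical of common consumer RGBD cameras) are all assumptions.

```python
def capture_angles(sweep_deg: float = 180.0, hfov_deg: float = 69.0,
                   step_deg: float = 60.0) -> list[float]:
    """Return the headings (degrees, relative to the start pose) at which
    the robot pauses to capture an RGBD frame during its 180-degree sweep.

    The step must not exceed the camera's horizontal FOV, otherwise the
    sweep would leave angular gaps that no frame covers.
    """
    if step_deg > hfov_deg:
        raise ValueError("set angle must not exceed the camera's horizontal FOV")
    angles = []
    a = 0.0
    while a < sweep_deg:
        angles.append(a)
        a += step_deg
    angles.append(sweep_deg)  # close the sweep at the final heading
    return angles
```

With the assumed defaults this yields captures at 0, 60, 120, and 180 degrees; each 69-degree frame then overlaps its neighbours, so the half-circle is fully imaged.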

[0046] Step 2: The robot body performs target detection and recognition on the photos taken by the RGBD camera. Step 2 includes the following steps:

[0047] Step 2.1: Each time image data is collected, it is sent to the trained deep learning model for detection; the target o...
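The patent states only that each captured frame is sent to a trained deep learning model; the network itself is not named in this excerpt. A minimal, hypothetical sketch of the per-frame post-processing such a detector's output typically needs (the `Detection` structure and threshold value below are illustrative, not from the patent):

```python
from dataclasses import dataclass


@dataclass
class Detection:
    """One detector output for a single RGBD frame (illustrative fields)."""
    label: str                        # object class, e.g. "door"
    confidence: float                 # model score in [0, 1]
    bbox: tuple[int, int, int, int]   # (x_min, y_min, x_max, y_max) in pixels


def filter_detections(raw: list[Detection],
                      threshold: float = 0.5) -> list[Detection]:
    """Keep only the detections whose score clears the confidence threshold,
    so later angle/distance estimation runs on trustworthy boxes."""
    return [d for d in raw if d.confidence >= threshold]
```

Each kept box's centre pixel, together with the aligned depth image, is what the later angle/distance estimation step consumes.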

Embodiment 2

[0063] Those skilled in the art can understand this embodiment as a more specific description of Embodiment 1.

[0064] As shown in Figures 1~3, this embodiment provides a navigation method for autonomous exploration of an unknown environment based on target recognition. The method runs on a system composed of a mobile robot and an RGBD camera; the RGBD camera, which captures color images and depth images, is mounted horizontally at the front of the robot.

[0065] The method comprises the following steps:

[0066] Step 1: the robot autonomously explores and takes pictures;

[0067] Step 2: the robot performs target detection and recognition;

[0068] Step 3: the robot estimates the angle and distance of the target;

[0069] Step 4: the robot formulates target navigation rules.
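Step 3 above (estimating the target's angle and distance from an RGBD frame) can be sketched under a standard pinhole-camera assumption. None of the constants below come from the patent: the 69-degree horizontal FOV and 640-pixel image width are merely typical RGBD values, and the function is an illustrative reconstruction, not the patented method.

```python
import math


def target_angle_and_distance(u: int, v: int, depth_m: float,
                              image_width: int = 640,
                              hfov_deg: float = 69.0) -> tuple[float, float]:
    """Estimate the horizontal bearing (degrees, positive to the right of
    the optical axis) and line-of-sight distance (meters) of a target from
    its bounding-box centre pixel (u, v) and the depth value at that pixel."""
    # Focal length in pixels, derived from the horizontal FOV (pinhole model).
    fx = (image_width / 2) / math.tan(math.radians(hfov_deg) / 2)
    # Horizontal offset of the pixel from the optical axis.
    dx = u - image_width / 2
    bearing = math.degrees(math.atan2(dx, fx))
    # The depth value is distance along the optical axis; convert it to
    # straight-line range toward the target.
    distance = depth_m / math.cos(math.radians(bearing))
    return bearing, distance
```

Adding the recorded capture heading of the frame to this bearing gives the target's direction in the robot's start frame, which is what a target navigation rule (Step 4) would act on.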

[0070] Wherein, step 1 includes the following steps:

[0071] Step 1.1: Enter the room from a set starting point, and control the robot body via navigation to complete a slow 180-degree rotation from right to left. During the...



Abstract

The invention provides an indoor mobile robot navigation method comprising the following steps: step 1, a robot body performs autonomous exploration and photographing through an RGBD camera to obtain pictures; step 2, the robot body carries out target detection and recognition on the pictures shot by the RGBD camera; step 3, the robot body estimates the angle and distance of a target object according to the target detection and recognition result; and step 4, the robot body formulates a target navigation rule according to the estimated angle and distance of the target object. The method does not need to be deployed in advance and can adapt to different indoor environments.

Description

Technical field

[0001] The invention relates to the technical field of robots, and in particular to a navigation method for an indoor mobile robot.

Background technique

[0002] With the ever faster working pace of modern office workers and the rising cost of human labor, the application of robots in daily life will become more and more widespread. Unlike industrial robots deployed in a specific production environment, indoor mobile robots face a wide variety of indoor layouts; they therefore need navigation capabilities that can adapt to different indoor environments. The industry usually builds indoor maps in advance and manually marks target locations to guide robot navigation, but this approach requires advance deployment, which is time-consuming and labor-intensive.

[0003] The patent document with the publication number CN107450540A discloses an indoor mobile robot navigation system and method based on infrared roa...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G01C21/20; G06V20/00; G06V10/82; G06N3/02
CPC: G01C21/206; G06N3/02; Y02T10/40
Inventors: 夏子涛, 郭震
Owner: 杭州景吾智能科技有限公司