
Monocular vision and quick-response code road sign based indoor autonomous navigation method

A monocular vision and autonomous navigation technology, applied in the field of indoor navigation, which addresses problems such as a small field of view, poor flexibility, and limits on the robot's maximum running speed, achieving the effects of avoiding complex calculation and meeting accuracy requirements.

Inactive Publication Date: 2017-07-21
Applicant: 北京品创智能科技有限公司

AI Technical Summary

Benefits of technology

This technology navigates a robot using a monocular visual sensor together with two-dimensional (QR) code road signs whose positions are stored in a map database. By detecting a landmark and computing the robot's angle and distance relative to it against those reference points, the robot can locate itself accurately without relying on the heavy computation of a full vision-based simultaneous localization and mapping (SLAM) pipeline, and can be guided toward its destination while avoiding obstacles such as walls or furniture, with better efficiency and precision than traditional methods.
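To make the map-database idea concrete, here is a minimal sketch, assuming a simple payload-to-pose lookup; it is not the patent's implementation, and the database contents, the function name robot_position, and all coordinates are illustrative assumptions.

```python
# A minimal, illustrative sketch (not the patent's code): each QR payload keys a
# known world pose, so one relative measurement gives an absolute position
# without running a full SLAM pipeline. All names and values are assumed.
import math

LANDMARK_DB = {                       # hypothetical map database: payload -> (x, y, theta)
    "QR-A1": (0.0, 0.0, 0.0),
    "QR-A2": (5.0, 0.0, math.pi / 2),
}

def robot_position(payload, dx, dy):
    """Absolute robot (x, y), given the robot's offset (dx, dy) expressed in
    the landmark's own frame (e.g. from inverting a camera-pose estimate)."""
    lx, ly, lth = LANDMARK_DB[payload]
    # Rotate the landmark-frame offset into the world frame, then translate.
    wx = lx + dx * math.cos(lth) - dy * math.sin(lth)
    wy = ly + dx * math.sin(lth) + dy * math.cos(lth)
    return wx, wy
```

The point of the design is that the QR payload itself identifies the landmark, so localization reduces to one dictionary lookup plus a 2D frame transform.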

Problems solved by technology

Existing technical solutions take different approaches, including sensor-based navigation with visual guidance, guide markers, lasers, ultrasonic waves, and cameras. These methods suffer from various limitations, including limited range and resolution, and difficulty in estimating distances reliably under environmental factors such as moisture, temperature variation, and atmospheric disturbance. They may also involve expensive equipment and complicated procedures that require highly skilled operators.



Examples


Embodiment Construction

[0046] The technical content of the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0047] As shown in Figure 1, the indoor autonomous navigation method based on monocular vision and two-dimensional code landmarks provided by the present invention includes the following steps. First, the initial position of the robot is obtained, and a two-dimensional code route is set according to the robot's initial and target positions. Second, the robot's monocular camera detects and recognizes the two-dimensional code in the image in real time, and the angle and distance between the robot's current driving direction and position and the detected landmark's position are calculated from the internal and external parameters of the monocular camera, determining the robot's current navigation information. Then, the robot moves according to this navigation information and judges whether the current position is the target position; if so, navigation is complete; otherwise, the robot adjusts its angle according to the navigation direction given by the two-dimensional code and returns to the detection step.
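The angle-and-distance computation described in paragraph [0047] can be sketched with OpenCV's QR detector and a perspective-n-point solve. This is an illustration under assumed values, not the patent's implementation: the camera matrix K, the zero distortion coefficients, and the landmark side length QR_SIDE stand in for real calibration data.

```python
# A minimal sketch of the per-frame step: detect a QR landmark with OpenCV,
# then recover the range and bearing between the camera (robot) and the
# landmark via solvePnP. K, dist, and QR_SIDE are assumed calibration values.
import cv2
import numpy as np

K = np.array([[800.0,   0.0, 320.0],    # assumed intrinsics (fx, fy, cx, cy)
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)                      # assume negligible lens distortion
QR_SIDE = 0.20                          # assumed landmark size: 0.20 m square

# 3D corners of the QR code in its own coordinate frame (z = 0 plane)
obj_pts = np.array([[0, 0, 0],
                    [QR_SIDE, 0, 0],
                    [QR_SIDE, QR_SIDE, 0],
                    [0, QR_SIDE, 0]], dtype=np.float64)

def locate_landmark(frame):
    """Return (decoded_text, distance_m, bearing_rad) or None if no QR found."""
    detector = cv2.QRCodeDetector()
    text, corners, _ = detector.detectAndDecode(frame)
    if not text or corners is None:
        return None
    img_pts = corners.reshape(-1, 2).astype(np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)
    if not ok:
        return None
    distance = float(np.linalg.norm(tvec))               # straight-line range
    bearing = float(np.arctan2(tvec[0, 0], tvec[2, 0]))  # angle off optical axis
    return text, distance, bearing
```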



Abstract

The invention discloses a monocular vision and quick-response (QR) code road sign based indoor autonomous navigation method. The method includes the steps of: S1) acquiring an initial position of a robot, and setting a QR code route according to the initial position and a target position of the robot; S2) detecting a QR code in real time with the robot's monocular camera, calculating the relative position between the robot and the QR code from the camera's internal and external parameters, and determining the robot's current navigation information; S3) moving according to the current navigation information and judging whether the current position is the target position; if yes, navigation is complete, otherwise the robot performs an angle adjustment according to the navigation direction of the QR code and returns to step S2). With this method, the complicated calculation of monocular-camera navigation can be avoided, and the accuracy requirement of indoor autonomous navigation can be met.
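A hedged sketch of the S1)–S3) loop follows, reusing the locate_landmark() function from the sketch above; get_frame(), rotate(), and move_forward() are hypothetical robot I/O hooks, and the 0.3 m arrival threshold is an assumption.

```python
# A hedged sketch of steps S1)-S3): plan a route of QR payloads, then loop
# detect -> localize -> move until the target landmark is reached.
# get_frame, rotate, and move_forward are hypothetical robot I/O hooks.
import math
import time

def navigate(route, get_frame, rotate, move_forward, step_m=0.1):
    target = route[-1]                                   # S1: route already planned
    while True:
        result = locate_landmark(get_frame())            # S2: detect and localize
        if result is None:
            time.sleep(0.05)                             # no landmark visible; retry
            continue
        text, distance, bearing = result
        if text == target and distance < 0.3:            # S3: destination reached?
            return True                                  # navigation complete
        if abs(bearing) > math.radians(5):               # S3: correct heading first
            rotate(bearing)
        move_forward(min(step_m, distance))              # advance, then back to S2
```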


Application Information

Owner: 北京品创智能科技有限公司