
Robot navigation method based on visual perception and spatial cognitive neural mechanism

A technology of spatial cognition and visual perception, applied in the field of brain-like navigation, which addresses the difficulty of brain-science exploration and the scarcity of breakthroughs in this area, and achieves the effects of improved biomimicry, strong autonomy, and reduced computational complexity.

Active Publication Date: 2019-01-18
SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

Although this research field provides strong impetus for the further development of robot navigation technology, breakthroughs remain scarce owing to factors such as the difficulty of exploring brain science.


Examples


Embodiment 1

[0081] Example 1 Robot Navigation under Illumination

[0082] Under illumination, the system is stimulated by both the visual images and the robot's motion. At time t the numerous position nodes form a firing bump such as the one shown in Figure 2. As shown in Figures 3a to 3d, x and y denote positions in the x and y directions, respectively. The motion of the bump on the spatial cortex is proportional to the motion of the robot in the simulated environment, and the normalized difference between the two is close to zero. The system designed by the present invention is therefore dynamic: it realizes the transformation between the robot's motion plane and the spatial cortex and helps the robot learn a two-dimensional spatial cognitive map of its environment. Referring to Figure 4, when the robot has different head orientations, the spatial responses of the position nodes are consistent, forming a firing pattern similar to the "position field" of position cells…
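
The patent text does not give the underlying equations here, but the behaviour described, an activity bump on a sheet of position nodes whose motion tracks the robot's motion, is characteristic of a continuous-attractor path-integration model. The sketch below is a minimal, hypothetical illustration of that idea in Python; the sheet size N, gain GAIN, bump width SIGMA, and the velocities are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Minimal sketch (not the patent's exact model): a 2D sheet of "position nodes"
# holds a Gaussian activity bump; a velocity input shifts the bump so that its
# motion on the sheet stays proportional to the robot's motion in the plane.

N = 64                 # nodes per side of the spatial-cortex sheet (assumed)
GAIN = 10.0            # nodes of bump shift per metre of robot motion (assumed)
SIGMA = 3.0            # bump width in node units (assumed)

xs, ys = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")

def bump(center):
    """Gaussian activity bump centred at `center` on the periodic sheet."""
    dx = (xs - center[0] + N / 2) % N - N / 2
    dy = (ys - center[1] + N / 2) % N - N / 2
    return np.exp(-(dx**2 + dy**2) / (2 * SIGMA**2))

def step(center, velocity, dt=0.1):
    """Shift the bump centre in proportion to the robot velocity (path integration)."""
    return (center + GAIN * np.asarray(velocity) * dt) % N

# Simulate a robot moving at constant velocity and track the bump on the sheet.
center = np.array([N / 2, N / 2])
for _ in range(100):
    center = step(center, velocity=(0.2, 0.1))   # m/s in x and y (assumed)
activity = bump(center)
print("bump peak at node", np.unravel_index(activity.argmax(), activity.shape))
```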

Embodiment 2

[0083] Example 2 Robot Navigation in Darkness

[0084] In darkness, the stimulation of the system by the visual images disappears, but the bump formed by the position nodes in the spatial cortex at time t persists and, as shown in Figures 6a and 6b, exhibits dynamics similar to those of Example 1, i.e. the movement of the bump in the spatial cortex is proportional to the speed of the robot.
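
As a hypothetical illustration of the dark condition, the sketch below updates the bump centre from velocity alone when no visual estimate is available, and applies a visual correction only when one is. The correction gain k_vis and the other constants are assumptions for illustration, not values from the patent.

```python
import numpy as np

N, GAIN = 64, 10.0     # sheet size and velocity gain, as in the previous sketch (assumed)

def update(center, velocity, visual_estimate=None, dt=0.1, k_vis=0.2):
    """Update the bump centre; in darkness (visual_estimate=None) only velocity is used."""
    center = (np.asarray(center, dtype=float) + GAIN * np.asarray(velocity) * dt) % N
    if visual_estimate is not None:
        # lighted condition: pull the bump toward the position implied by the visual cells
        err = (np.asarray(visual_estimate) - center + N / 2) % N - N / 2
        center = (center + k_vis * err) % N
    return center

# Darkness: the bump keeps moving in proportion to the robot's velocity.
center = np.array([32.0, 32.0])
for _ in range(50):
    center = update(center, velocity=(0.2, 0.1), visual_estimate=None)
print("bump centre after 5 s in darkness:", center)
```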

Embodiment 3

[0085] Example 3 Robot navigation in the presence of motion noise

[0086] When there is random noise in the robot's velocity information, the system remains resistant to interference: as shown in Figures 7a to 7d, the normalized error between the motion of the bump and the actual motion of the robot fluctuates within a small range around zero.
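
The normalized-error test can be illustrated with a short simulation: integrate a noisy velocity on the sheet, compare the implied bump displacement with the true robot displacement, and report the normalized error, which should fluctuate in a small band around zero. All constants and the noise level below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
N, GAIN, DT = 64, 10.0, 0.1          # assumed constants, as in the earlier sketches

true_pos = np.zeros(2)               # true robot position in metres
center = np.array([32.0, 32.0])      # bump centre in node units
errors = []
for _ in range(100):
    v_true = np.array([0.2, 0.1])                     # true velocity (m/s)
    v_noisy = v_true + rng.normal(0.0, 0.05, size=2)  # velocity with random noise
    true_pos += v_true * DT
    center = (center + GAIN * v_noisy * DT) % N
    bump_disp = (center - 32.0 + N / 2) % N - N / 2   # signed bump displacement on the sheet
    norm_err = np.linalg.norm(bump_disp / GAIN - true_pos) / max(np.linalg.norm(true_pos), 1e-9)
    errors.append(norm_err)
print("mean normalized error:", float(np.mean(errors)))
```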



Abstract

The invention relates to a robot navigation method based on visual perception and a spatial cognitive neural mechanism. Collected visual images are transformed by a neural network into visual nodes representing the robot's position and orientation-angle information, forming visual cells; the visual code of the visual cells is transformed into a spatial description of the environment, and a cognitive map, similar to that formed in the brain of a freely moving mammal, is constructed; positioning and navigation of the robot are then realized on the basis of this cognitive map. Through a neural computation system for environment perception and spatial memory, the robot completes a series of tasks such as visual processing, spatial representation, self-positioning, and map updating, thereby realizing robot navigation with high biomimicry and strong autonomy in an unknown environment. Compared with traditional simultaneous localization and mapping (SLAM) technology, the method avoids a series of complex computations such as manually designed visual features and feature-point matching, and greatly improves the robustness of the system to factors such as illumination changes, viewing-angle changes, and object motion in the natural environment.
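
The abstract describes the data flow only at a high level; the following sketch lays out that flow with placeholder components (a stand-in visual encoder, a velocity-integrating spatial layer, and a simple map that matches stored visual codes). The class names, the encoding, and the matching rule are hypothetical and are not taken from the patent.

```python
import numpy as np

class VisualCells:
    """Encode an image into a feature vector (stand-in for the patent's neural network)."""
    def encode(self, image):
        image = np.asarray(image, dtype=float)
        return image.ravel() / (np.linalg.norm(image) + 1e-9)

class SpatialLayer:
    """Maintain a position estimate by integrating velocity (path integration)."""
    def __init__(self):
        self.position = np.zeros(2)
    def integrate(self, velocity, dt):
        self.position += np.asarray(velocity) * dt

class CognitiveMap:
    """Store visual codes keyed by the position at which they were observed."""
    def __init__(self):
        self.entries = []   # list of (position, visual code)
    def update(self, position, code):
        self.entries.append((position.copy(), code))
    def localize(self, code):
        # return the stored position whose visual code best matches the query
        best = max(self.entries, key=lambda e: float(np.dot(e[1], code)))
        return best[0]

# Toy usage: build a map from two views, then localize a repeated view.
vc, sl, cm = VisualCells(), SpatialLayer(), CognitiveMap()
for image, v in [(np.eye(4), (0.5, 0.0)), (np.ones((4, 4)), (0.0, 0.5))]:
    cm.update(sl.position, vc.encode(image))
    sl.integrate(v, dt=1.0)
print("localized at:", cm.localize(vc.encode(np.eye(4))))
```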

Description

Technical field

[0001] The invention relates to a brain-like navigation method; specifically, to a system that uses visual-perception and spatial-cognition neurocomputing mechanisms to enable robots to navigate in unknown environments.

Background technique

[0002] Research on autonomous robot navigation mainly focuses on two questions: "where" (positioning) and "where to go" (path planning). Although existing navigation technologies solve these two problems to a certain extent, they still have significant shortcomings: GPS has low positioning accuracy and cannot provide navigation information in special or occluded environments such as indoors, underwater, or after disasters; traditional simultaneous localization and mapping (SLAM) technology relies on expensive sensors such as odometers and lasers, and its spatial perception information is limited. Visual navigation has become a research hotspot in recent years owing to its rich perceptual information sources…


Application Information

IPC(8): G05D1/02
CPC: G05D1/0253; G05D1/0276
Inventors: Si Bailu (斯白露), Zhao Dongye (赵冬晔)
Owner SHENYANG INST OF AUTOMATION - CHINESE ACAD OF SCI