
Blind autonomous navigation method based on stereoscopic vision and information fusion

A technology combining stereo vision and autonomous navigation, applied in the field of navigation systems. It addresses problems such as the difficulty of separating the main target from the background, the lack of autonomous navigation algorithms, and a high obstacle misjudgment rate, thereby improving blind users' autonomy and independent walking ability and enhancing overall system performance.

Inactive Publication Date: 2010-05-05
常州超媒体与感知技术研究所有限公司

Problems solved by technology

However, blind-guide aids based on stereo vision are easily affected by ambient light sources and occluders. Noise introduced by stereo matching makes it difficult to separate the main target from the background, which raises the obstacle misjudgment rate, and the resulting target depth information often carries large errors.
Ultrasonic sensors offer high distance-detection accuracy, but their use in complex environments is limited.
Most current guide aids rely on one of these two approaches, combined with other sensors for obstacle detection; they lack an autonomous navigation algorithm that effectively exploits the complementary strengths of the different sensors.
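The complementary fusion the passage calls for can be illustrated with a minimal sketch: combine a noisy long-range stereo depth estimate with a precise short-range ultrasonic reading via inverse-variance weighting, so the more reliable sensor dominates. The function name and the variance values are assumptions for illustration, not the patent's method.

```python
# Hypothetical sketch: fuse a stereo-vision depth estimate with an
# ultrasonic range reading using inverse-variance weighting.
# Variance values below are illustrative, not calibrated figures.

def fuse_range(stereo_m: float, stereo_var: float,
               ultra_m: float, ultra_var: float) -> float:
    """Inverse-variance weighted fusion of two range estimates (metres)."""
    w_s = 1.0 / stereo_var   # weight of the stereo estimate
    w_u = 1.0 / ultra_var    # weight of the ultrasonic estimate
    return (w_s * stereo_m + w_u * ultra_m) / (w_s + w_u)

# Stereo depth is noisy at range; ultrasonic is precise but short-range,
# so the fused estimate is pulled strongly toward the ultrasonic reading.
fused = fuse_range(stereo_m=2.4, stereo_var=0.25, ultra_m=2.1, ultra_var=0.01)
print(round(fused, 3))
```

With these weights the ultrasonic reading (variance 0.01) contributes 25 times more than the stereo estimate (variance 0.25), which matches the document's point that each sensor should cover the other's weakness.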



Embodiment Construction

[0019] The present invention will now be described in further detail with reference to the accompanying drawings and preferred embodiments. The drawings are simplified schematic diagrams that illustrate only the basic structure of the invention, and therefore show only the configurations relevant to it.

[0020] Figure 2 is a schematic diagram of the system software of the present invention. The software is built on an embedded hardware platform running a real-time Linux operating system and is divided into four main parts: multi-sensor data acquisition; data communication, transmission and storage; information processing and fusion; and information display with a user interface. The navigation algorithm centers on stereo vision and multi-sensor information fusion. As shown in the flow chart of Figure 1, the outputs of the left image and the right image are first co...
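The pipeline the flow chart describes (preprocessing, matching, disparity, depth, obstacle detection, then fusion downstream) can be sketched as a chain of stubs. All function names, the focal length, baseline, and threshold are assumptions for illustration; the patent's actual identifiers are not given in the text.

```python
# Illustrative skeleton of the navigation data flow described above.
# Stage names and parameters are hypothetical; each stage is a stub so
# the path from stereo pair to obstacle list is explicit.

def preprocess(left, right):
    # Placeholder for denoising / rectification of the stereo pair.
    return left, right

def stereo_match(left, right):
    # Toy "matching": per-pixel absolute difference stands in for a real
    # feature-extraction + correspondence stage producing a disparity map.
    return [[abs(l - r) for l, r in zip(lr, rr)] for lr, rr in zip(left, right)]

def disparity_to_depth(disp, focal_px=500.0, baseline_m=0.12):
    # Classic pinhole stereo relation: depth Z = f * B / d.
    return [[focal_px * baseline_m / d if d > 0 else float("inf") for d in row]
            for row in disp]

def detect_obstacles(depth, near_m=2.5):
    # Mark pixels closer than the (assumed) safety threshold.
    return [(r, c, z) for r, row in enumerate(depth)
            for c, z in enumerate(row) if z < near_m]

def navigate(left, right):
    left, right = preprocess(left, right)
    depth = disparity_to_depth(stereo_match(left, right))
    # Detected targets would be fused with ultrasonic / compass / GPS
    # readings downstream, then reported by voice output.
    return detect_obstacles(depth)

left  = [[10, 20], [30, 40]]
right = [[ 9, 20], [30, 10]]
print(navigate(left, right))
```

The skeleton mirrors the document's stated ordering only; a real implementation would use rectified images and a proper matching algorithm in place of the per-pixel difference.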


Abstract

The invention relates to the technical field of navigation systems, in particular to a blind autonomous navigation method based on stereoscopic vision and information fusion. The outputs of the left and right images are connected to an image preprocessing part; the preprocessing output feeds an image segmentation part, which is connected to a feature extraction part. The feature extraction output is connected to a stereo matching part, whose output feeds a disparity-map acquisition part. The disparity map feeds both a depth-image acquisition part and a target detection and marking part, and the depth image is also connected to the target detection and marking part. The output of target detection and marking is connected to a part that calculates target size, distance and direction, whose output feeds a multi-sensor information fusion part. The fusion output is connected to a voice output part that reports obstacle position, size, distance, direction and the like, while the outputs of an ultrasonic ranging part, an electronic compass direction-finding part and a GPS positioning part are also connected to the multi-sensor information fusion part. The algorithm has great practical value for improving the autonomy and independent walking ability of the blind.

Description

Technical field [0001] The invention relates to a navigation system, in particular to an autonomous navigation method for blind people based on stereo vision and information fusion. Background technique [0002] There are tens of millions of people with disabilities in our country. When they travel, they face complex and changeable traffic environments; the growing number of motor vehicles and the various obstacles on the road threaten blind travelers. Travel takes two forms: walking and riding in vehicles, and a walking blind person requires a guide or a guide dog. [0003] Stereo vision is one of the core technologies of computer vision; it can be used to reconstruct the depth of objects in the environment from images. Binocular vision is currently a widely studied stereoscopic perception system, with the advantages of compact structure, low energy consumption, rich information, and high recognition accurac...
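The binocular depth reconstruction the background section refers to reduces to the triangulation relation Z = f · B / d, where f is the focal length in pixels, B the camera baseline, and d the disparity. A minimal numeric illustration follows; the focal length and baseline values are assumed for the example, not taken from the patent.

```python
# Minimal numeric illustration of binocular triangulation: Z = f * B / d.
# f (pixels) and B (metres) below are assumed example values.

def depth_from_disparity(d_px: float, f_px: float = 700.0,
                         baseline_m: float = 0.1) -> float:
    """Depth in metres of a point with disparity d_px between the two views."""
    if d_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / d_px

# A larger disparity means a closer object: 700 * 0.1 / 35 gives about 2 m.
print(depth_from_disparity(35.0))
```

Note the inverse relationship: depth error grows quadratically with distance for a fixed disparity error, which is why the document pairs stereo vision with ultrasonic ranging for nearby obstacles.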

Claims


Application Information

IPC(8): G01C21/28; G06F3/01; G06T7/00
Inventor 刘云辉杨延光刘顺范才智周东翔蔡宣平
Owner 常州超媒体与感知技术研究所有限公司