
Road detection method based on bimodal data fusion

A road detection and data fusion technology, applied in image data processing, character and pattern recognition, instruments, etc. It addresses the problems that single-modal road detection is not robust to changes in ambient light, is strongly affected by shadows, and cannot detect areas for which the sensor returns no data.

Active Publication Date: 2013-07-10
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

[0004] The purpose of the present invention is to propose a road detection method based on dual-modal data fusion, which solves the problems that traditional road detection based on single-modal data is not robust to changes in ambient light, is strongly affected by shadows on the road, is insensitive to road signposts and guardrails, and cannot detect areas for which the sensor returns no data.



Embodiment Construction

[0067] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0068] The present invention uses the raw data in the KITTI dataset, including the color images, the laser radar data points, and the rotation-translation matrix [R_LC t_LC] between the laser radar and the camera. The experiment is conducted as shown in figure 1, and the implementation steps are as follows:

[0069] 1.1) Coordinate system conversion of the laser radar three-dimensional points: a laser radar three-dimensional point X_L = (x_l, y_l, z_l)^T, expressed in the laser radar three-dimensional coordinate system with x-axis, y-axis and z-axis, is left-multiplied by the rotation-translation matrix [R_LC t_LC] between the lidar and the camera and thereby converted into the lidar 3D point X_C = (x_c, y_c, z_c)^T in the camera three-dimensional coordinate system, where R_LC denotes the rotation matrix between the lidar coordina...
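A minimal sketch of this coordinate transform in Python/NumPy, assuming the points are stored as an (N, 3) array and that R_LC and t_LC come from the KITTI calibration files; the function name and array layout are illustrative and not part of the patent:

import numpy as np

def lidar_to_camera(points_lidar, R_LC, t_LC):
    """Convert lidar 3D points X_L into camera-frame points X_C.

    points_lidar : (N, 3) array of points X_L = (x_l, y_l, z_l)^T
    R_LC         : (3, 3) rotation matrix from the lidar frame to the camera frame
    t_LC         : (3,)   translation vector from the lidar frame to the camera frame
    Returns an (N, 3) array of points X_C = (x_c, y_c, z_c)^T.
    """
    # X_C = R_LC * X_L + t_LC, applied to all points at once
    return points_lidar @ R_LC.T + t_LC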


Abstract

The invention discloses a road detection method based on bimodal data fusion. The method comprises the following steps: converting the coordinate system of the laser radar three-dimensional points; obtaining the set of laser radar three-dimensional points that fall within the camera image; setting a height threshold to obtain the subset of laser radar three-dimensional points belonging to obstacles; obtaining the subset of laser radar three-dimensional points belonging to the passable road area; building color Gaussian mixture models of the non-road area and of the passable road area; obtaining a region of interest for road detection on the image; and constructing an energy function based on a Markov random field and computing its minimum to obtain a globally optimal road detection result for the image. The method is robust in complex environments, its detection result is not affected by shadows on the road surface, fine obstacles in the sensed environment can be distinguished from the passable road area, areas without laser radar returns can still be judged as road or non-road, and the method is suitable for fields such as autonomous vehicle guidance.
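As an illustration of two of these steps, a minimal Python sketch follows, assuming the lidar points have already been converted to the camera frame (y axis pointing downward, as in KITTI) and that the image colors at the projected point locations have been collected; the threshold value, function names, and the use of scikit-learn's GaussianMixture are assumptions, not details taken from the patent:

import numpy as np
from sklearn.mixture import GaussianMixture

# Assumed height threshold in metres below the camera centre: points more than
# this distance below the camera are treated as passable road, the rest as obstacles.
HEIGHT_THRESHOLD = 1.5

def split_by_height(points_cam):
    """points_cam: (N, 3) lidar points in the camera frame (y points down)."""
    is_road = points_cam[:, 1] > HEIGHT_THRESHOLD
    return points_cam[~is_road], points_cam[is_road]   # obstacle subset, road subset

def fit_color_models(road_pixels, nonroad_pixels, n_components=3):
    """Fit one color Gaussian mixture model per class.

    road_pixels / nonroad_pixels: (M, 3) RGB values sampled where the
    projected lidar points of each class land in the image.
    """
    gmm_road = GaussianMixture(n_components=n_components).fit(road_pixels)
    gmm_nonroad = GaussianMixture(n_components=n_components).fit(nonroad_pixels)
    return gmm_road, gmm_nonroad

The per-pixel log-likelihoods of the two mixtures (GaussianMixture.score_samples) could then serve as the data term of the Markov random field energy mentioned in the abstract.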

Description

Technical field

[0001] The invention relates to image segmentation technology, in particular to a road detection method based on dual-mode data fusion.

Background technique

[0002] Sensor-based road detection algorithms play a vital role in the field of autonomous navigation of mobile robots. Traditional road detection algorithms are based on a single sensor, such as a camera or a lidar, and perception of a mobile robot's surroundings based on a single sensor has many deficiencies. A monocular camera can provide the color information of the environment, but it requires the lighting of the environment to be stable and, in road detection, is not robust to shadows or to environments with low color contrast. A binocular camera can compensate for the poor performance of the monocular camera under changing ambient light, mottled shadows, and low color contrast by acquiring the 3D information of the environment, but t...


Application Information

IPC(8): G06K9/00; G06T7/00
Inventors: 黄文琦, 龚小谨, 刘济林
Owner: ZHEJIANG UNIV