Autonomous system sensing method and system based on multi-modal fusion

An autonomous system and multi-modal technology, applied in the field of autonomous system perception methods and systems, can solve problems such as unstable working status of the perception system, achieve the effects of solving the limitation of vision, reducing uncertainty, and improving performance

Pending Publication Date: 2022-01-28
西安电子科技大学广州研究院 (Xidian University Guangzhou Research Institute)
Cites: 0 · Cited by: 2
AI Technical Summary

Problems solved by technology

[0004] The embodiments of the present application provide an autonomous system perception method and system based on multi-modal fusion, aiming to solve the problem of the unstable working status of existing perception systems.

Embodiment Construction

[0024] Specific embodiments of the present invention are further described below in conjunction with the accompanying drawings. It should be noted that the descriptions of these embodiments are intended to aid understanding of the present invention, not to limit it. In addition, the technical features of the embodiments described below may be combined with one another as long as they do not conflict.

[0025] With the development of artificial-intelligence techniques such as machine learning and deep learning, large amounts of data can be processed and relatively high-precision models can be built. In practice, no single sensor can fully reflect the real-world situation. This application therefore combines different sensors, or different views from the same sensor, to reconstruct the real situation, which is then fed into a deep learning model.
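The patent does not specify the fusion operator used in [0025]. A minimal sketch of one common choice, feature-level fusion by concatenation of per-modality embeddings before they enter a learned model, is shown below; the feature dimensions and variable names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def fuse_modalities(camera_feat: np.ndarray, lidar_feat: np.ndarray) -> np.ndarray:
    """Feature-level fusion sketch: concatenate per-modality feature
    vectors into a single vector that a downstream model would consume.
    Concatenation is an assumed operator; the patent leaves it unspecified."""
    return np.concatenate([camera_feat, lidar_feat])

# Hypothetical embeddings from two sensor branches of one vehicle.
camera_feat = np.random.rand(128)  # e.g. output of an image encoder
lidar_feat = np.random.rand(64)    # e.g. output of a point-cloud encoder
fused = fuse_modalities(camera_feat, lidar_feat)
print(fused.shape)  # (192,)
```

Other fusion operators (weighted averaging, attention-based mixing) would slot into the same place; concatenation is only the simplest instance.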

[0026] see figure 1 The embodiment sho...



Abstract

The invention discloses an autonomous system sensing method and system based on multi-modal fusion. The method comprises the steps of: obtaining data collected by the sensors of a plurality of vehicles; processing the sensor data with a trained multi-modal data fusion framework and fusing the processed results to obtain a primary fusion result; processing the primary fusion result with a target recognition service, retaining region-of-interest information; constructing a cooperation-level fusion architecture using an existing multi-view fusion algorithm and performing secondary fusion on the region-of-interest information uploaded by the plurality of vehicles to obtain a secondary fusion result; and issuing the secondary fusion result to the vehicles so that they can make control decisions accordingly. By constructing this cascaded cooperative data fusion framework, the method improves and expands the environment view, overcomes the field-of-view limitation of a single autonomous vehicle, reduces uncertainty, and improves the performance of the perception algorithm.
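The two-stage cascade described in the abstract can be sketched as follows. This is an illustrative skeleton only: the class and function names, the confidence-threshold ROI filter, and the max-confidence merge rule are all assumptions standing in for the patent's unspecified fusion framework and target recognition service.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class RegionOfInterest:
    """Hypothetical ROI record a vehicle uploads after target recognition."""
    vehicle_id: str
    label: str
    confidence: float

def primary_fusion(sensor_frames: List[Dict[str, float]]) -> Dict[str, float]:
    """On-vehicle stage: merge per-sensor detections into one local view.
    Max-confidence merging is an assumed stand-in for the trained
    multi-modal fusion framework."""
    merged: Dict[str, float] = {}
    for frame in sensor_frames:
        for label, conf in frame.items():
            merged[label] = max(merged.get(label, 0.0), conf)
    return merged

def extract_rois(vehicle_id: str, local_view: Dict[str, float],
                 threshold: float = 0.5) -> List[RegionOfInterest]:
    """Target-recognition stage: retain only confident regions of interest."""
    return [RegionOfInterest(vehicle_id, label, conf)
            for label, conf in local_view.items() if conf >= threshold]

def secondary_fusion(all_rois: List[RegionOfInterest]) -> Dict[str, float]:
    """Cooperation-level stage: fuse ROIs uploaded by multiple vehicles
    into one shared view that is issued back to every vehicle."""
    fused: Dict[str, float] = {}
    for roi in all_rois:
        fused[roi.label] = max(fused.get(roi.label, 0.0), roi.confidence)
    return fused

# Usage: two vehicles, each fusing its own camera and lidar detections.
v1_view = primary_fusion([{"pedestrian": 0.9}, {"pedestrian": 0.7, "cyclist": 0.4}])
v2_view = primary_fusion([{"cyclist": 0.8}])
shared = secondary_fusion(extract_rois("v1", v1_view) + extract_rois("v2", v2_view))
print(shared)  # {'pedestrian': 0.9, 'cyclist': 0.8}
```

The point of the cascade is visible in the usage example: vehicle 1 alone would have discarded the low-confidence cyclist, but the shared view recovers it from vehicle 2's upload, extending each vehicle's effective field of view.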

Description

Technical Field

[0001] The invention relates to the technical field of automobile automatic driving, and in particular to an autonomous system perception method and system based on multi-modal fusion.

Background

[0002] With the rapid development of sensor technology, self-driving cars are becoming increasingly common. While perceiving the surrounding environment, a self-driving vehicle must overcome the field-of-view limitation of a single vehicle while still ensuring real-time performance and perception accuracy. One of the basic challenges of computer vision for self-driving cars is achieving high precision across various applications under different road, weather, lighting, and working conditions. However, two-dimensional computer vision based on existing cameras is strongly affected by environmental conditions, which can easily make the working status of related self-driving applications unstable. [0003] T...

Claims


Application Information

IPC(8): G06K9/62, G06V10/80, G06V10/82, G06N3/04, G06N3/08
CPC: G06F18/251
Inventors: 杨清海, 沈八中, 徐丽娟, 梅牧雨, 张媛
Owner: 西安电子科技大学广州研究院 (Xidian University Guangzhou Research Institute)