
Autonomous mobile platform environment perception and mapping method based on bionics

A bionics-based technology for autonomous mobility and environment perception, applied to biological models, image enhancement, and image analysis. It addresses problems such as the huge computing power consumed by conventional environment modeling, and modeling results that cannot be used for semantic analysis or cognitive understanding in navigation tasks.

Pending Publication Date: 2021-03-16
SOUTH CHINA UNIV OF TECH

AI Technical Summary

Problems solved by technology

However, building a model of a real environment with complex characteristics consumes a huge amount of computing power, and the resulting models cannot be used for deeper navigation tasks such as semantic analysis and cognitive understanding.

Method used


Examples


Embodiment Construction

[0081] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described fully and clearly below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the present invention.

[0082] As shown in Figure 1 and Figure 2, a bionics-based method for environment perception and mapping of an autonomous mobile platform comprises the following steps:

[0083] Step S1: collect an automatic driving dataset, feed the images to the improved semantic segmentation network, and train the improved semantic segmentation network; ...
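As a hedged illustration of Step S1's training objective (the patent excerpt does not give the loss function, so this is an assumption based on standard practice), semantic segmentation networks are typically trained with a per-pixel softmax cross-entropy loss. The function name, shapes, and data below are all hypothetical:

```python
import numpy as np

def pixelwise_cross_entropy(logits, labels):
    """Average negative log-likelihood of the true class at each pixel.

    logits: (H, W, C) raw network scores; labels: (H, W) class indices.
    """
    # Numerically stable log-softmax over the class axis.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=-1, keepdims=True))
    h, w = labels.shape
    # Pick out the log-probability of the ground-truth class per pixel.
    nll = -log_probs[np.arange(h)[:, None], np.arange(w)[None, :], labels]
    return nll.mean()

rng = np.random.default_rng(0)
logits = rng.normal(size=(4, 4, 3))        # toy 4x4 "image", 3 classes
labels = rng.integers(0, 3, size=(4, 4))   # ground-truth class per pixel
loss = pixelwise_cross_entropy(logits, labels)
```

Training the "improved" network would then consist of minimizing this loss over the driving dataset by gradient descent; the improvements the patent claims are not specified in this excerpt.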



Abstract

The invention discloses a bionics-based environment perception and mapping method for an autonomous mobile platform, comprising the following steps: S1, collect an automatic driving dataset, feed the images to an improved semantic segmentation network, and train the network; S2, initialize the combination of binocular vision and an IMU; S3, fuse the binocular vision and IMU data; S4, construct a bionic cell model; S5, construct a cognitive map. An experience map is built on bionic principles, which greatly reduces the number of parameters in the map and allows scene-map information covering a relatively large area to be stored. Meanwhile, semantically segmented scene information is fused with place cells, which improves detection robustness. By fully exploiting the strong feature-extraction and learning capability of convolutional neural networks, the method overcomes the excessive mapping computation and inaccurate recognition of traditional SLAM, and improves the accuracy of location judgment during bionic environment perception and mapping.
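The abstract's "experience map" with semantically fused place cells can be sketched as follows. This is a minimal illustration of the general idea found in bionic SLAM systems such as RatSLAM, not the patented algorithm: each experience node stores a pose plus a compact semantic signature (e.g. a class histogram from the segmentation network), and a new observation either matches an existing node (loop closure) or creates a new one. The class name, threshold, and signatures are hypothetical:

```python
import math

class ExperienceMap:
    """Toy experience map: nodes of (x, y, semantic signature)."""

    def __init__(self, match_threshold=0.9):
        self.nodes = []
        self.match_threshold = match_threshold

    @staticmethod
    def _similarity(sig_a, sig_b):
        # Cosine similarity between semantic class-histogram signatures.
        dot = sum(a * b for a, b in zip(sig_a, sig_b))
        na = math.sqrt(sum(a * a for a in sig_a))
        nb = math.sqrt(sum(b * b for b in sig_b))
        return dot / (na * nb) if na and nb else 0.0

    def observe(self, x, y, signature):
        """Return the index of the matched or newly created experience."""
        for i, (_, _, sig) in enumerate(self.nodes):
            if self._similarity(sig, signature) >= self.match_threshold:
                return i                    # loop closure: reuse this node
        self.nodes.append((x, y, signature))
        return len(self.nodes) - 1          # novel scene: new node

emap = ExperienceMap()
a = emap.observe(0.0, 0.0, [0.7, 0.2, 0.1])  # e.g. road/building/vegetation mix
b = emap.observe(5.0, 0.0, [0.1, 0.1, 0.8])  # distinct scene: new node
c = emap.observe(0.1, 0.0, [0.7, 0.2, 0.1])  # revisit: matches the first node
```

Because each node stores only a pose and a short signature rather than dense geometry, the map's parameter count stays small even over large areas, which is the storage advantage the abstract claims.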

Description

Technical field

[0001] The invention belongs to the technical field of positioning and navigation of autonomous mobile platforms, and in particular relates to a bionics-based environment perception and mapping method for an autonomous mobile platform.

Background technique

[0002] Simultaneous Localization and Mapping (SLAM) refers to a technology in which a robot actively builds a map of an unknown environment while localizing itself based on estimates of its own state and the map. Accurate positioning information helps autonomous mobile platforms complete tasks such as path planning and map drawing. Traditional real-time localization and mapping algorithms use Bayesian probability estimation to compute velocity and pose from external sensor input, solve for prior information about the environment using complex nonlinear optimization, and build a huge database as a search source. However, when building a ...
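The "Bayesian probability estimation" the background refers to can be illustrated with a toy one-dimensional Kalman filter: predict the pose from a velocity input, then correct it with a noisy measurement, weighting each by its uncertainty. All numbers and names here are illustrative assumptions, not values from the patent:

```python
def kalman_step(x, p, v, dt, z, q=0.01, r=0.25):
    """One Bayesian predict/update cycle for a 1-D pose estimate.

    x, p : prior pose estimate and its variance
    v, dt: velocity input and time step (motion model: x' = x + v*dt)
    z    : noisy pose measurement
    q, r : process and measurement noise variances
    """
    # Predict: propagate the state and grow the uncertainty.
    x_pred = x + v * dt
    p_pred = p + q
    # Update: blend prediction and measurement by their confidences.
    k = p_pred / (p_pred + r)       # Kalman gain in [0, 1]
    x_new = x_pred + k * (z - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0                      # uncertain initial pose
x, p = kalman_step(x, p, v=1.0, dt=1.0, z=1.2)
```

After the update the estimate lies between the motion prediction (1.0) and the measurement (1.2), and the variance shrinks; the patent's criticism is that scaling this style of estimation to complex real environments is what becomes computationally expensive.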

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06T 7/73; G06T 7/10; G06F 16/587; G06N 3/04; G06N 3/00
CPC: G06T 7/73; G06T 7/10; G06F 16/587; G06N 3/002; G06T 2207/30252; G06N 3/045
Inventors: 王博, 吴忻生, 陈安, 杨璞光, 刘丞, 陈纯玉
Owner: SOUTH CHINA UNIV OF TECH