
Hierarchical feature fusion method for multi-target detection of mobile robot

A mobile robot target detection technology, applied in the field of environmental perception of mobile robots, which addresses the problems of insufficient feature extraction and low detection ability for objects of different scales, and achieves the effect of improving detection capability and efficiency.

Pending Publication Date: 2021-02-05
BEIJING UNIV OF TECH
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] Existing methods extract features with a single-scale convolution kernel, so the extracted features are not rich enough and the detection of objects of different scales in the same scene is poor. To address this, the present invention uses dilated convolutions with different dilation rates to simulate receptive fields of different sizes and thereby extract features of different scales, and at the same time proposes a hierarchical feature fusion method to fuse these multi-scale features. Compared with the channel-concatenation used by other methods, the hierarchical feature fusion of the present invention makes different channels contain feature information of different scales, which effectively improves the ability of the target detection algorithm to detect objects of different scales in the same scene, and thereby improves the efficiency with which an intelligent robot searches for objects.
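The idea can be illustrated with a minimal PyTorch sketch, under stated assumptions: three parallel 3x3 dilated-convolution branches (the dilation rates 1, 2, 4 and the channel counts are illustrative, not taken from the patent), and the layered superposition read here as a hierarchical summation of the branch outputs before concatenation. The visible text does not spell out the exact superposition rule, so this is only one plausible reading, not the patent's definitive method.

import torch
import torch.nn as nn

class DilatedPyramidFusion(nn.Module):
    """Three parallel dilated-convolution branches followed by a layered
    (hierarchical) fusion.  Dilation rates, channel counts and the fusion
    rule are illustrative assumptions, not values from the patent."""

    def __init__(self, in_channels=1024, branch_channels=256, dilations=(1, 2, 4)):
        super().__init__()
        # Each branch uses the same 3x3 kernel but a different dilation, so the
        # receptive fields differ while the spatial size of the map is preserved.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_channels, branch_channels, kernel_size=3,
                          padding=d, dilation=d),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])

    def forward(self, x):
        outs = [b(x) for b in self.branches]
        # Assumed layered-superposition rule: each branch output is accumulated
        # onto the previous ones, so the later groups of channels mix several
        # receptive-field sizes, unlike plain channel concatenation where each
        # channel keeps a single scale.
        fused, acc = [], outs[0]
        fused.append(acc)
        for o in outs[1:]:
            acc = acc + o
            fused.append(acc)
        return torch.cat(fused, dim=1)  # shape (N, 3 * branch_channels, H, W)

if __name__ == "__main__":
    x = torch.randn(1, 1024, 19, 19)        # stand-in for the preliminary feature map
    print(DilatedPyramidFusion()(x).shape)  # torch.Size([1, 768, 19, 19])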

Embodiment Construction

[0029] Embodiments of the present invention will be described in further detail below in conjunction with the accompanying drawings.

[0030] As shown in Figure 1, the present invention is a hierarchical feature fusion method for multi-target detection of a mobile robot, comprising the following steps:

[0031] Step 1: Preliminarily obtain the feature map by inputting the images in the dataset into the pre-trained improved VGG-16. The VGG-16 network structure is shown in Figure 2(a): it consists of 13 convolutional layers and 3 fully connected layers. The convolutional layers are, in order, Conv1_1, Conv1_2, Conv2_1, Conv2_2, Conv3_1, Conv3_2, Conv3_3, Conv4_1, Conv4_2, Conv4_3, Conv5_1, Conv5_2, Conv5_3, and the fully connected layers are, in order, FC6, FC7, FC8. The improved VGG-16 network structure is shown in Figure 2(b): the FC6 and FC7 fully connected layers of the VGG-16 network are changed into convolutional layers. The preliminarily acquired feature map T1 described in step 1 is the output o...
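A minimal sketch of such a backbone is given below, assuming an SSD-style conversion in which FC6 becomes a 3x3 convolution and FC7 a 1x1 convolution; the kernel sizes and channel counts are assumptions for illustration, since the visible text only states that FC6 and FC7 are changed into convolutional layers.

import torch
import torch.nn as nn
from torchvision.models import vgg16

class ImprovedVGG16(nn.Module):
    """VGG-16 backbone with the FC6/FC7 fully connected layers rebuilt as
    convolutions.  Kernel sizes and channel counts are illustrative assumptions."""

    def __init__(self):
        super().__init__()
        # The 13 convolutional layers Conv1_1 ... Conv5_3 (with pooling) come
        # from the standard torchvision VGG-16 feature extractor.
        self.features = vgg16().features
        # FC6 and FC7 re-expressed as convolutions, so the network keeps a
        # spatial feature map instead of a flat vector (assumed layer sizes).
        self.conv6 = nn.Conv2d(512, 1024, kernel_size=3, padding=1)   # former FC6
        self.conv7 = nn.Conv2d(1024, 1024, kernel_size=1)             # former FC7
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.features(x)           # output after Conv5_3 and pooling
        x = self.relu(self.conv6(x))
        x = self.relu(self.conv7(x))
        return x                       # preliminary feature map (T1 in the text)

if __name__ == "__main__":
    net = ImprovedVGG16()
    print(net(torch.randn(1, 3, 300, 300)).shape)  # torch.Size([1, 1024, 9, 9])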


Abstract

The invention relates to the field of environmental perception of mobile robots, in particular to a hierarchical feature fusion method for multi-target detection of a mobile robot, and aims to improve the detection capability of a target detection algorithm for targets of different scales so as to improve the environmental perception capability of an intelligent robot. The method includes: inputting the images in the dataset into a pre-trained improved VGG-16 to preliminarily obtain a feature map; inputting the preliminarily acquired feature map into a dilated convolution pyramid structure, which comprises three dilated convolution branches with different dilation rates and is used to match the targets of different scales captured by the visual sensor while the robot moves; fusing the feature maps obtained by the different branches in the layered superposition mode proposed by the invention, so that all channels in the fused feature map contain feature information of different scales; progressively convolving the fused feature map to obtain feature maps of different sizes; and finally obtaining the category and bounding box of the object to be detected.
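The step of progressively convolving the fused feature map to obtain feature maps of different sizes can be pictured as a cascade of stride-2 convolutions, each output feeding a detection head, in the spirit of SSD-style extra feature layers. The number of stages and the channel counts in the sketch below are assumptions, not values disclosed in the visible text.

import torch
import torch.nn as nn

class ProgressiveFeaturePyramid(nn.Module):
    """Cascade of stride-2 convolutions that turns the fused feature map into
    progressively smaller maps for multi-scale detection.  The number of stages
    and the channel counts are illustrative assumptions."""

    def __init__(self, in_channels=768, stages=((256, 512), (128, 256), (128, 256))):
        super().__init__()
        blocks, c_in = [], in_channels
        for mid, out in stages:
            blocks.append(nn.Sequential(
                nn.Conv2d(c_in, mid, kernel_size=1),
                nn.ReLU(inplace=True),
                nn.Conv2d(mid, out, kernel_size=3, stride=2, padding=1),
                nn.ReLU(inplace=True),
            ))
            c_in = out
        self.blocks = nn.ModuleList(blocks)

    def forward(self, x):
        maps = [x]                 # keep the fused map itself as the first scale
        for block in self.blocks:
            x = block(x)           # each stage roughly halves the spatial size
            maps.append(x)
        return maps                # feature maps of different sizes

if __name__ == "__main__":
    fused = torch.randn(1, 768, 19, 19)   # stand-in for the fused feature map
    for m in ProgressiveFeaturePyramid()(fused):
        print(tuple(m.shape))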

Description

Technical field

[0001] The invention relates to the field of environmental perception of mobile robots, in particular to a hierarchical feature fusion method for multi-target detection of mobile robots.

Background technique

[0002] With the continuous expansion of the application range of intelligent robots in the home environment, ever higher requirements are placed on a robot's environmental perception ability. When a robot searches for objects, its visual sensor often captures objects of different scales, and existing target detection algorithms cannot detect these objects well; it is therefore necessary to improve the detection ability of the target detection algorithm for objects of different scales, thereby improving the environmental perception ability of intelligent robots.

[0003] In order to enhance the detection of targets of different scales, many scholars have improved the two-stage targ...

Claims


Application Information

IPC(8): G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V2201/07; G06N3/045; G06F18/253
Inventors: 杨金福, 袁帅, 李明爱, 王康
Owner: BEIJING UNIV OF TECH