Visual SLAM closed-loop detection method based on a feature extraction and dimensionality reduction neural network

A feature-extraction and neural-network technology, applied in the fields of robot vision and mobile robots, that addresses the problems of dynamic environmental changes and the influence of illumination changes.

Active Publication Date: 2022-05-17
BEIJING UNIV OF TECH

AI Technical Summary

Problems solved by technology

To address the problem that traditional closed-loop detection methods are easily affected by dynamic environmental changes and illumination changes, the present invention uses a convolutional neural network model trained on large data sets, so that the network acquires the ability to learn features.



Examples


Embodiment

[0060] The first step is to build the network model. Using the Base-Block unit shown in figure 1, together with a pooling layer and a softmax classification layer, a convolutional neural network for classification is constructed; the resulting classification network is shown in Figure 4. The specific implementation is written using the open-source deep learning framework TensorFlow.
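
As a concrete illustration of this step, the following is a minimal TensorFlow (Keras API) sketch of such a classification network. The internal structure of the Base-Block unit (figure 1) and of the full network (Figure 4) is not reproduced in this excerpt, so the Base-Block is assumed here to be a convolution + batch normalization + ReLU group, and the filter counts, input size, and number of blocks are illustrative only.

import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 205  # Places205 contains 205 scene categories

def base_block(x, filters):
    # Assumed Base-Block: convolution + batch normalization + ReLU (figure 1 is not shown in this excerpt)
    x = layers.Conv2D(filters, kernel_size=3, padding="same")(x)
    x = layers.BatchNormalization()(x)
    return layers.ReLU()(x)

def build_classification_net(input_shape=(224, 224, 3)):
    # Stack Base-Block units with pooling, then finish with a softmax classification layer
    inputs = layers.Input(shape=input_shape)
    x = inputs
    for filters in (64, 128, 256):              # illustrative widths, not the patent's values
        x = base_block(x, filters)
        x = layers.MaxPooling2D(pool_size=2)(x)
    x = layers.GlobalAveragePooling2D()(x)      # pooling layer before the classifier
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_classification_net()
model.summary()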

[0061] In the second step, the classification convolutional neural network built in the first step is trained. The network is trained on the Places205 scene classification dataset, which contains 205 scene categories. The loss function of the network is as follows:

[0062]
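
The expression for paragraph [0062] does not survive in this excerpt. For a 205-way softmax classification network, a standard categorical cross-entropy loss would take the following form; this is an assumed, illustrative formula, not necessarily the patent's exact expression:

loss = -(1/N) · Σ_{i=1..N} Σ_{c=1..205} y_{i,c} · log ŷ_{i,c}

where N is the number of training samples in a batch, y_{i,c} is the one-hot label of sample i for class c, and ŷ_{i,c} is the corresponding softmax output.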

[0063] The update strategy of network weights adopts Adam algorithm:

[0064] g_t = ∇_θ loss_t(θ_{t-1})

[0065] m_t = β_1·m_{t-1} + (1-β_1)·g_t

[0066] v_t = β_2·v_{t-1} + (1-β_2)·g_t²

[0067] m̂_t = m_t / (1-β_1^t)

[0068] v̂_t = v_t / (1-β_2^t)

[0069] θ_t = θ_{t-1} - α·m̂_t / (√v̂_t + ε)

[0070] The parameters are set to: β_1 = 0.9, β_2 = 0.999, ε = 10⁻⁸. Set t = 0 in the initial iteration, m_0 = 0, v_0 = 0, the initi...
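
As a concrete illustration of these update rules, the following is a minimal NumPy sketch of one Adam step using the stated hyper-parameters. The learning rate α and the gradient are placeholders, since their values are not given in this excerpt.

import numpy as np

def adam_step(theta, grad, m, v, t, alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update: moment estimates, bias correction, parameter step
    t += 1
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate m_t
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate v_t
    m_hat = m / (1 - beta1 ** t)              # bias-corrected m̂_t
    v_hat = v / (1 - beta2 ** t)              # bias-corrected v̂_t
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v, t

# Initial iteration as in paragraph [0070]: t = 0, m_0 = 0, v_0 = 0
theta = np.zeros(10)                          # illustrative parameter vector
m, v, t = np.zeros_like(theta), np.zeros_like(theta), 0
grad = np.random.randn(10)                    # stand-in for ∇_θ loss_t(θ_{t-1})
theta, m, v, t = adam_step(theta, grad, m, v, t)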



Abstract

The invention discloses a visual SLAM closed-loop detection method based on a feature extraction and dimensionality reduction neural network. The invention adopts a convolutional neural network model trained on large data sets, so that the network acquires the ability to learn features. In this way, the similarity comparison between images is converted into a similarity comparison between feature vectors. To further improve detection speed, an autoencoder layer is added at the end of the convolutional neural network to reduce the dimensionality of the extracted image features. The convolutional neural network has properties such as translation invariance and scale invariance, which effectively overcome the sensitivity of traditional hand-crafted features to environmental changes, and it extracts features faster. The method addresses the shortcomings of traditional visual SLAM closed-loop detection, namely long feature extraction time and strong sensitivity to environmental and illumination changes. It can effectively improve the accuracy and recall of closed-loop detection and plays an important role in building a globally consistent environment map.
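
To make the comparison step concrete, the following is a minimal sketch of how loop-closure candidates could be scored once each image has been mapped to a low-dimensional feature vector by the network and the autoencoder layer. The cosine similarity measure and the threshold value are assumptions for illustration; the abstract does not specify which similarity measure or threshold the method actually uses.

import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two image feature vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def detect_loop_closure(current_feat, keyframe_feats, threshold=0.9):
    # Return the index and score of the most similar past keyframe,
    # or (None, best score) if no keyframe exceeds the assumed threshold
    if not keyframe_feats:
        return None, 0.0
    sims = [cosine_similarity(current_feat, kf) for kf in keyframe_feats]
    best = int(np.argmax(sims))
    return (best, sims[best]) if sims[best] >= threshold else (None, sims[best])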

Description

Technical field

[0001] The invention relates to a loop closure detection (Loop Closure Detection) method within visual simultaneous localization and mapping (Visual Simultaneous Localization and Mapping, VSLAM) algorithms for mobile robots, and belongs to the technical field of robot vision.

Background technique

[0002] With the rapid development of artificial intelligence technology in recent years, the closely related field of robotics has also made great progress. Mobile robots are a key research direction within robotics, and enabling a robot to navigate in an unknown environment is the key basis for its autonomous movement. After long-term research, researchers have converged on a general algorithmic framework for this problem, namely simultaneous localization and mapping. Depending on the sensors used, it can be divided into simultaneous localization and mapping using ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G01C25/00; G06N3/04; G06N3/08
CPC: G06N3/08; G01C25/00; G06N3/045
Inventor: 阮晓钢, 王飞, 黄静, 朱晓庆, 周静, 张晶晶, 董鹏飞
Owner: BEIJING UNIV OF TECH