Word bag generation method and device based on feature matching

A feature-matching and bag-of-words technology, applied in the computer field, which solves the problem of long construction time and achieves the effect of reducing the time consumed to obtain the bag of words

Active Publication Date: 2019-09-17
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

[0002] When an intelligent robot performs a task in an unknown environment, it needs to use SLAM (Simultaneous Localization And Mapping) technology to construct a map. Because errors accumulate during map construction, the constructed map is often not closed. To improve the accuracy of the constructed map, it can be corrected with a closed-loop detection algorithm based on two-dimensional images. Detecting closed loops with such an algorithm requires constructing a bag of words. To obtain a bag of words with high accuracy, the constructed bag of words must be matched against the feature descriptors of the feature key points of the acquired continuous environmental image frames. In the prior art, the bag of words is usually constructed with different construction parameters; after multiple bags of words are obtained, an optimal one is selected from them through experiments as the basis for correction. However, this approach requires repeated calculations to obtain the final bag of words, which consumes a long time.



Examples


Embodiment 1

[0068] Figure 1 is a schematic flow diagram of the bag-of-words generation method based on feature matching provided in Embodiment 1 of the present application. As shown in Figure 1, the method includes the following steps:

[0069] Step 101. Obtain a two-dimensional image video stream shot for a target environment, wherein the target environment is the environment corresponding to the map constructed by the target device when performing simultaneous localization and mapping (SLAM).

[0070] Specifically, when an intelligent robot performs a task in an unknown environment, it needs to use SLAM technology to construct a map. Because errors accumulate during map construction, the constructed map is often not closed. At this time, it is necessary to continue acquiring the two-dimensional image video stream of the unknown environment, so that the video stream can be used to correct the constructed map.
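A minimal sketch of how step 101 might be realized in practice, assuming an OpenCV-readable video source; the function name, the source path, and the frame-sampling stride below are illustrative assumptions, not part of the patent:

```python
# Minimal sketch of step 101: acquire a two-dimensional image video stream.
# "environment.mp4" and the sampling stride are hypothetical placeholders.
import cv2

def acquire_video_stream(source="environment.mp4", stride=1):
    """Yield two-dimensional image frames shot for the target environment."""
    cap = cv2.VideoCapture(source)
    frames = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:                  # end of stream
            break
        if index % stride == 0:     # keep every `stride`-th frame
            frames.append(frame)
        index += 1
    cap.release()
    return frames
```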

...

Embodiment 2

[0113] Figure 4 is a schematic structural diagram of the bag-of-words generation device based on feature matching provided in Embodiment 2 of the present application. As shown in Figure 4, the device comprises:

[0114] The acquisition unit 41 is configured to acquire a two-dimensional image video stream shot for a target environment, wherein the target environment is the environment corresponding to the map constructed by the target device during simultaneous localization and mapping (SLAM);

[0115] The extraction unit 42 is configured to use a feature extraction algorithm to extract feature descriptors of the target feature key points of each image frame in the two-dimensional image video stream, wherein the number of target feature key points extracted from each image frame of the two-dimensional image video stream is at least one;

[0116] The matching unit 43 is configured to perform similarity matching on the feature descri...
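A hedged sketch of what the extraction unit 42 and matching unit 43 could do, assuming ORB as the feature extraction algorithm and Hamming distance for descriptor similarity; the distance threshold and the leaf-node grouping strategy below are illustrative assumptions rather than the patent's specified implementation:

```python
# Sketch of units 42 (extraction) and 43 (sequential similarity matching).
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)

def extract_descriptors(frame):
    """Extract feature descriptors of the target feature key points (unit 42)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    return descriptors  # shape (num_keypoints, 32), dtype uint8, or None

def match_into_leaf_nodes(frames, distance_threshold=50):
    """Match descriptors frame by frame, in stream order (unit 43).

    Descriptors that match an existing representative join that leaf node;
    unmatched descriptors open a new leaf node.
    """
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    leaf_nodes = []          # each entry: list of similar descriptors
    representatives = None   # one representative descriptor per leaf node
    for frame in frames:
        descriptors = extract_descriptors(frame)
        if descriptors is None:
            continue
        if representatives is None:
            leaf_nodes = [[d] for d in descriptors]
            representatives = descriptors.copy()
            continue
        matches = matcher.match(descriptors, representatives)
        for m in matches:
            if m.distance < distance_threshold:
                leaf_nodes[m.trainIdx].append(descriptors[m.queryIdx])
            else:
                leaf_nodes.append([descriptors[m.queryIdx]])
                representatives = np.vstack([representatives,
                                             descriptors[m.queryIdx]])
    return leaf_nodes
```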

Embodiment 3

[0141] Figure 6 is a schematic structural diagram of an electronic device provided in Embodiment 3 of the present application, including: a processor 601, a storage medium 602, and a bus 603. The storage medium 602 includes the bag-of-words generation device based on feature matching shown in Figure 4 and stores machine-readable instructions executable by the processor 601. When the electronic device runs the above-mentioned bag-of-words generation method based on feature matching, the processor 601 communicates with the storage medium 602 through the bus 603, and the processor 601 executes the machine-readable instructions to perform the following steps:

[0142] Obtaining a two-dimensional image video stream shot for the target environment, wherein the target environment is the environment corresponding to the map constructed by the target device during simultaneous localization and mapping (SLAM);

[0143] Using a feature extraction algorithm...



Abstract

The invention provides a bag-of-words generation method and device based on feature matching. The method comprises the steps of: obtaining a two-dimensional image video stream shot for a target environment, wherein the target environment is the environment corresponding to a map constructed by target equipment during simultaneous localization and mapping (SLAM); extracting feature descriptors of the target feature key points of each image frame in the two-dimensional image video stream by using a feature extraction algorithm, the number of target feature key points of each image frame in the extracted two-dimensional image video stream being at least one; performing similarity matching on the feature descriptors of the target feature key points of the image frames, in the order of the image frames in the two-dimensional image video stream, to obtain the leaf nodes of the bag of words; and taking the leaf nodes and the number of the leaf nodes as input parameters and inputting the input parameters into a text clustering model to construct the bag of words. By means of the method, the time consumed to obtain the bag of words can be shortened.
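A hedged sketch of the final construction step described above, assuming a k-means style clustering as a stand-in for the "text clustering model"; using the number of leaf nodes as the cluster count is an illustrative reading of the input parameters, not a detail confirmed by the patent:

```python
# Sketch: construct the bag of words from the leaf nodes and their count.
import numpy as np
from sklearn.cluster import KMeans

def build_bag_of_words(leaf_nodes):
    """Cluster all leaf-node descriptors into visual words."""
    num_leaf_nodes = len(leaf_nodes)                       # input parameter 1
    descriptors = np.vstack([np.vstack(node) for node in leaf_nodes])
    descriptors = descriptors.astype(np.float32)           # input parameter 2
    model = KMeans(n_clusters=num_leaf_nodes, n_init=10, random_state=0)
    model.fit(descriptors)
    # Each cluster centre acts as one visual word of the bag of words.
    return model.cluster_centers_
```

Fixing the cluster count from the matching step avoids re-running the construction with different parameters, which is the stated source of the time saving.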

Description

Technical Field

[0001] The present application relates to the field of computer technology, and in particular, to a bag-of-words generation method and device based on feature matching.

Background

[0002] When an intelligent robot performs a task in an unknown environment, it needs to use SLAM (Simultaneous Localization And Mapping) technology to construct a map. Because errors accumulate during map construction, the constructed map is often not closed. To improve the accuracy of the constructed map, it can be corrected with a closed-loop detection algorithm based on two-dimensional images. Detecting closed loops with such an algorithm requires constructing a bag of words. To obtain a bag of words with high accuracy, it is necessary to match the constructed bag of ...


Application Information

IPC(8): G06K9/00; G06K9/46
CPC: G06V20/40; G06V20/10; G06V10/464
Inventor: 岳昊嵩, 苗津毓, 于跃, 吴星明, 陈伟海
Owner: BEIHANG UNIV