
Feature Point Extraction and Matching Method and System in Simultaneous Positioning and Mapping Technology

A feature point extraction and matching method, applied in the field of information technology, that addresses problems such as slow algorithm running speed and achieves the effects of a more uniform feature point distribution and more accurate camera poses

Active Publication Date: 2021-11-16
PEKING UNIV

AI Technical Summary

Problems solved by technology

In the traditional structure-from-motion algorithm, there is no prior information about feature point matches, so establishing the correspondence between a 3D point and the image feature points requires traversing all of the image feature points, which severely limits the running speed of the algorithm.
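
For illustration only (not from the patent), the sketch below shows the exhaustive descriptor matching that this traversal implies, assuming BRIEF-like 32-byte binary descriptors and OpenCV's brute-force matcher; the descriptor counts are arbitrary.

```python
# Minimal sketch: brute-force matching of binary descriptors between the
# reconstructed 3D points and all feature points of the current frame.
# Every query descriptor is compared against every frame descriptor, which is
# the exhaustive traversal that slows down the traditional pipeline.
import numpy as np
import cv2

rng = np.random.default_rng(0)
map_desc = rng.integers(0, 256, size=(500, 32), dtype=np.uint8)    # hypothetical descriptors of 3D map points
frame_desc = rng.integers(0, 256, size=(2000, 32), dtype=np.uint8) # hypothetical descriptors of current-frame features

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(map_desc, frame_desc)  # O(|map| x |frame|) Hamming comparisons
print(len(matches), "tentative 2D-3D correspondences")
```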

Method used



Examples


Embodiment Construction

[0041] The present invention will be described in further detail below through specific embodiments and accompanying drawings.

[0042] This embodiment provides a feature point extraction and matching method in simultaneous localization and mapping technology. The process, shown in Figure 1, includes the following steps:

[0043] 1) Perform the following operations on each key frame image extracted from the video sequence:

[0044] 2) If the key frame image is the first frame of the video, extract n FAST corner points from the image (the FAST corner extraction procedure is discussed in detail in the pyramid-based corner detection method later in the text), then select the k corner points with the largest response values, where k < n, and extract BRIEF features from them (see the code sketch after these steps);

[0045] 3) Otherwise (that is, the key frame image is not the first frame of the video) perform the following sub-steps:

[0046] 3.1) Extract k FAST co...
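
The following is a minimal sketch of the first-frame processing in steps 1)-2); it is not taken from the patent. It assumes OpenCV with the opencv-contrib module for the BRIEF extractor, and the values of n and k, the function name extract_first_frame_features, and the commented file name are illustrative assumptions.

```python
# Minimal sketch: detect FAST corners in the first key frame, keep the k
# strongest by response, and compute BRIEF descriptors for those k corners.
import cv2

def extract_first_frame_features(gray, n=1000, k=300):
    fast = cv2.FastFeatureDetector_create()
    corners = fast.detect(gray, None)
    # Keep at most n corners, ordered by FAST response, then take the k strongest (k < n).
    corners = sorted(corners, key=lambda kp: kp.response, reverse=True)[:n]
    strongest = corners[:k]
    brief = cv2.xfeatures2d.BriefDescriptorExtractor_create()  # requires opencv-contrib
    keypoints, descriptors = brief.compute(gray, strongest)
    # The remaining corners carry no descriptor; they can later be tracked by optical flow.
    return keypoints, descriptors, corners[k:]

# Usage (hypothetical file name):
# gray = cv2.imread("keyframe_000.png", cv2.IMREAD_GRAYSCALE)
# kps, descs, descriptorless = extract_first_frame_features(gray)
```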



Abstract

The invention relates to a feature point extraction and matching method and system in simultaneous localization and mapping technology. The method extracts n FAST corner points from the first key frame image of the video and selects the k corner points with the largest response values to extract BRIEF features. For each frame from the second to the last, k FAST corner points are extracted and their BRIEF features computed; these k corner points are used as feature points and matched against the 3D point cloud reconstructed from the previous frame, or against the feature points extracted from the previous frame, to calculate the camera pose. The method replaces part of the feature point matching process with optical flow to improve running speed, and uses the optical flow method to track the corner points for which no BRIEF feature was extracted, saving the time spent computing feature descriptors. The invention can reconstruct denser point clouds while maintaining the feature point extraction speed, and makes the distribution of feature points more uniform.
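
As a minimal sketch of the optical flow tracking and pose computation described above (not the patent's implementation), the following assumes OpenCV's pyramidal Lucas-Kanade optical flow and a PnP solver with RANSAC; the function name, array shapes, and camera matrix are illustrative assumptions.

```python
# Minimal sketch: track descriptor-less corner points with Lucas-Kanade optical
# flow, then estimate the camera pose from 2D-3D correspondences via PnP + RANSAC.
import numpy as np
import cv2

def track_and_estimate_pose(prev_gray, cur_gray, prev_pts_2d, pts_3d, K):
    # prev_pts_2d: (N, 1, 2) float32 image points from the previous frame
    # pts_3d:      (N, 3)    float32 reconstructed 3D points matched to prev_pts_2d
    cur_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, prev_pts_2d, None)
    good = status.ravel() == 1
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        pts_3d[good], cur_pts[good], K, distCoeffs=None)
    return ok, rvec, tvec, inliers, cur_pts[good]

# Example camera matrix (fx, fy, cx, cy are placeholders):
# K = np.array([[fx, 0, cx], [0, fy, cy], [0, 0, 1]], dtype=np.float64)
```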

Description

Technical Field

[0001] The invention belongs to the field of information technology, and in particular relates to a feature point extraction and matching method and system in simultaneous localization and mapping technology.

Background Technique

[0002] Simultaneous localization and mapping (SLAM) technology, an algorithm for quickly mapping and localizing in unknown scenes, has attracted extensive attention from academia and industry in recent years. It takes captured video as input and can reconstruct the 3D point cloud and camera parameters of the video scene in real time or near real time. With the development of industries such as autonomous driving and augmented reality, the technology is being applied ever more widely. It exploits the relationship between consecutive video frames to effectively improve the efficiency of feature point matching, and the efficiency of the traditional structure-from-motion ...


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06T7/246, G06T7/269, G06T7/73
CPC: G06T2207/10021, G06T2207/20164, G06T7/246, G06T7/269, G06T7/73
Inventors: 李胜, 蒙力, 陈毅松, 汪国平, 盖孟
Owner: PEKING UNIV