
Feature matching method based on attention mechanism and neighborhood consistency

A feature matching technology based on an attention mechanism, applied in the field of deep-learning-based computer vision. It can solve problems such as the lack of neighborhood consistency, and achieves effects such as improved matching quality, improved quadratic space and time complexity, and wide application prospects.

Pending Publication Date: 2022-07-15
SOUTHEAST UNIV +1

AI Technical Summary

Problems solved by technology

This is an effective feature matching structure, but applying the attention mechanism to the feature matching task leads to the loss of neighborhood consistency, because the attention network can be regarded as a graph neural network in which all nodes are fully connected.
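As a rough illustration of this point (my own sketch, not code from the patent): dot-product attention assigns a weight between every pair of keypoints, which amounts to aggregating over a fully connected graph, whereas restricting those weights to the k strongest edges yields an explicit local neighborhood. The tensor shapes and the `knn_mask` helper below are illustrative assumptions.

```python
import torch

def attention_weights(desc):
    # desc: (N, C) descriptors; dot-product attention over all N keypoints.
    scores = desc @ desc.t() / desc.shape[1] ** 0.5
    return scores.softmax(dim=-1)        # (N, N): a dense, fully connected "graph"

def knn_mask(weights, k=8):
    # Keep only each node's k strongest attention edges, giving an explicit
    # local neighborhood instead of full connectivity.
    idx = weights.topk(k, dim=-1).indices
    mask = torch.zeros_like(weights)
    mask.scatter_(-1, idx, 1.0)
    return mask

desc = torch.randn(100, 256)
w = attention_weights(desc)              # every keypoint attends to all others
local = knn_mask(w)                      # each keypoint keeps only k neighbors
print(w.shape, local.sum(dim=-1))        # (100, 100); each row sums to k
```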




Embodiment Construction

[0039] The feature matching method based on an attention mechanism and neighborhood consistency proposed by the present invention is described in detail below with reference to the accompanying drawings:

[0040] Step 1. Input a single image, apply a random homography transformation to it and record the resulting homography matrix, thereby obtaining the two images fed to the network (the original image and the transformed image) as well as the ground-truth homography matrix used to supervise network training. Use the SuperPoint deep convolutional network to extract key points and descriptors from the two images, yielding the key point coordinates p and the 256-dimensional descriptors d that are input to the attention-and-neighborhood-consistency model. Assuming images A and B have M and N key points respectively, the key point coordinates p of the two images have dimensions (M, 2) and (N, 2), and the descriptors d have dimensions (M, 256) and (N, 256) r...
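A hedged sketch of the data flow and shapes in Step 1 follows. `extract_features` is only a stand-in for the SuperPoint network named in the text (not its real interface), and the input image is synthetic so the snippet runs on its own.

```python
import cv2
import numpy as np

def random_homography(h, w, jitter=0.15):
    # Randomly perturb the four image corners and fit the ground-truth homography.
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = src + np.random.uniform(-jitter * w, jitter * w, (4, 2))
    return cv2.getPerspectiveTransform(src, dst.astype(np.float32))

def extract_features(img, num_kp=512):
    # Placeholder for SuperPoint: keypoint coordinates (num_kp, 2) and
    # 256-dimensional descriptors (num_kp, 256), matching the shapes in the text.
    kp = np.random.rand(num_kp, 2) * [img.shape[1], img.shape[0]]
    desc = np.random.randn(num_kp, 256)
    return kp.astype(np.float32), desc.astype(np.float32)

img_a = np.random.randint(0, 255, (480, 640), dtype=np.uint8)  # "image A" (synthetic)
H_gt = random_homography(*img_a.shape[:2])                     # ground-truth homography
img_b = cv2.warpPerspective(img_a, H_gt, (640, 480))           # "image B"

p_a, d_a = extract_features(img_a)   # p_a: (M, 2), d_a: (M, 256)
p_b, d_b = extract_features(img_b)   # p_b: (N, 2), d_b: (N, 256)
```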



Abstract

The invention provides a feature matching method based on an attention mechanism and neighborhood consistency. The method comprises the following steps: extracting image key points and their descriptors by using a convolutional neural network; encoding the key point positions and fusing the position information into the descriptors; using a self-attention network and a cross-attention network to enhance the representation capability of the descriptors and to search for correspondences between the descriptors of the two images; constructing a graph by using a k-nearest-neighbor algorithm according to the attention weights, evaluating the neighborhood consistency of the nodes by using a graph neural network, and fusing the consistency information into the attention aggregation; and, according to the enhanced descriptors, computing the matching result by using the Sinkhorn algorithm and mutual-nearest-neighbor screening. The invention provides a method for integrating neighborhood consistency into an attention mechanism: while the attention mechanism aggregates global context information, the neighborhood consistency of the aggregation process is evaluated dynamically, and local and global information are combined to jointly enhance the representation capability of the descriptors, so that higher-quality matching is obtained.
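The final step of the abstract can be illustrated with a minimal sketch of my own (simplified, not the patent's code): Sinkhorn normalization of a descriptor-similarity matrix followed by mutual-nearest-neighbor screening. A full implementation would typically iterate in the log domain and may add a "dustbin" row/column for unmatched points; both are omitted here.

```python
import numpy as np

def sinkhorn(scores, n_iters=20):
    # Alternate row/column normalization of exp(scores) to obtain a soft
    # assignment between the two keypoint sets.
    P = np.exp(scores)
    for _ in range(n_iters):
        P /= P.sum(axis=1, keepdims=True)
        P /= P.sum(axis=0, keepdims=True)
    return P

def mutual_nn(P):
    # Keep a match (i, j) only if j is i's best column and i is j's best row.
    row_best, col_best = P.argmax(axis=1), P.argmax(axis=0)
    return [(i, j) for i, j in enumerate(row_best) if col_best[j] == i]

d_a = np.random.randn(100, 256).astype(np.float32)   # enhanced descriptors, image A
d_b = np.random.randn(120, 256).astype(np.float32)   # enhanced descriptors, image B
matches = mutual_nn(sinkhorn(d_a @ d_b.T / np.sqrt(256)))
```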

Description

Technical field

[0001] The invention relates to a feature matching method and device based on an attention mechanism and neighborhood consistency, and belongs to the technical field of deep-learning-based computer vision.

Background technique

[0002] Feature matching refers to finding correct correspondences between the points and features of two images, and is a key step in many 3D computer vision tasks such as structure from motion and simultaneous localization and mapping. Obtaining correct matches lays a good foundation for these tasks, but occlusion, blurring, similar repetitive textures, and illumination and viewpoint changes can make it extremely challenging. Therefore, how to exploit the information hidden in the descriptors, such as context information and neighborhood consistency information, to eliminate interference is one of the most important problems in feature matching.

[0003] The classic feature matching process generally consists of four parts: (1)...
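For reference only (my own sketch, not part of the invention), the matching stage of the classic pipeline is commonly implemented as nearest-neighbor descriptor matching with Lowe's ratio test to reject ambiguous pairs:

```python
import numpy as np

def ratio_test_matches(d_a, d_b, ratio=0.8):
    # Euclidean distances between every descriptor pair.
    dists = np.linalg.norm(d_a[:, None, :] - d_b[None, :, :], axis=-1)
    matches = []
    for i, row in enumerate(dists):
        j1, j2 = np.argsort(row)[:2]       # best and second-best candidates
        if row[j1] < ratio * row[j2]:      # accept only a clearly better best match
            matches.append((i, j1))
    return matches

d_a = np.random.randn(50, 256).astype(np.float32)
d_b = np.random.randn(60, 256).astype(np.float32)
print(len(ratio_test_matches(d_a, d_b)))
```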


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06V10/46; G06V10/764; G06V10/82; G06N3/04; G06N3/08
CPC: G06N3/084; G06N3/045; G06N3/044; G06F18/2415
Inventor: 杜松林, 芦晓勇
Owner: SOUTHEAST UNIV