
FAST feature homogenization extraction and IMU-based inter-frame feature mismatching removal method

An extraction method and feature extraction technology, applied in the fields of image processing and computer vision, which can solve problems such as poor suitability for real-time use, inaccurate judgment of feature mismatches, and reduced feature matching accuracy.

Active Publication Date: 2020-05-15
SOUTHEAST UNIV

AI Technical Summary

Problems solved by technology

These methods are computationally complex and consume too much calculation time, so they cannot be used well in applications with real-time requirements.
[0010] In summary, the feature mismatch removal methods described above depend on the quality of the image features. When the image feature matching contains many mismatches, the mismatches cannot be accurately identified, which reduces the accuracy of feature matching.



Examples


Embodiment

[0100] An important application field of the method of the present invention includes, but is not limited to, monocular visual-inertial SLAM. Its working principle is as follows: images are acquired by the monocular camera, acceleration and angular velocity are acquired by the IMU sensor, and after a series of calculations the translation and rotation of the system in world coordinates are output in real time. As an important part of image data processing, the method of the present invention is responsible for extracting feature points from the images and outputting feature matches between adjacent images for subsequent calculations.
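For illustration only, the Python sketch below shows one common way a visual-inertial front end can spread FAST corners over the whole image: bucket the detections into grid cells and keep only the strongest responses in each cell. The function name extract_fast_homogenized, the cell size, the per-cell quota, and the FAST threshold are assumptions made for this sketch, not details taken from the patent.

# Hypothetical front-end sketch: grid-bucketed FAST extraction (OpenCV).
# Parameter values are example choices, not values from the patent.
import cv2

def extract_fast_homogenized(gray, cell=64, per_cell=2, fast_threshold=20):
    # Detect FAST corners over the whole image.
    detector = cv2.FastFeatureDetector_create(threshold=fast_threshold)
    keypoints = detector.detect(gray, None)

    # Bucket keypoints by grid cell so that every image region is represented.
    buckets = {}
    for kp in keypoints:
        key = (int(kp.pt[1]) // cell, int(kp.pt[0]) // cell)
        buckets.setdefault(key, []).append(kp)

    # Keep only the strongest responses in each cell.
    kept = []
    for cell_kps in buckets.values():
        cell_kps.sort(key=lambda k: k.response, reverse=True)
        kept.extend(cell_kps[:per_cell])
    return kept

# Usage example (placeholder file name):
# pts = extract_fast_homogenized(cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE))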



Abstract

The invention discloses a FAST feature homogenization extraction and IMU-based inter-frame feature mismatching removal method. Firstly, a FAST feature homogenization extraction method is adopted to obtain FAST feature points that are as uniform and high-quality as possible over the whole image, without generating extra calculation overhead. The IMU-based inter-frame feature mismatching removal method then uses IMU sensor data (acceleration and angular velocity) to calculate the translation and rotation transformation between adjacent images, from which a fundamental matrix model is derived for judgment: when the camera is static, feature mismatches are judged and removed according to the motion vector length of each feature match; when the camera is moving, feature mismatches are judged and removed according to whether each feature match conforms to the fundamental matrix model.
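As a rough reading of this abstract, the sketch below shows how an IMU-predicted motion could be turned into a fundamental-matrix check, with a simple motion-vector-length test for the static case. Here R and t stand for the frame-to-frame rotation and translation obtained from the IMU data, K is the camera intrinsic matrix, and the function names and threshold values are illustrative assumptions; the patent's actual derivation and decision rules are not reproduced here.

# Hedged sketch (NumPy): IMU-assisted removal of feature mismatches.
# R, t: frame-to-frame rotation and translation predicted from IMU data.
# K: camera intrinsic matrix. Threshold values are illustrative only.
import numpy as np

def skew(t):
    # Cross-product (skew-symmetric) matrix of a 3-vector.
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def keep_matches(pts1, pts2, R, t, K, camera_static,
                 motion_thresh=2.0, epi_thresh=1e-3):
    # pts1, pts2: Nx2 pixel coordinates of matched features in adjacent frames.
    pts1 = np.asarray(pts1, dtype=float)
    pts2 = np.asarray(pts2, dtype=float)

    if camera_static:
        # Static camera: a correct match should hardly move between frames,
        # so reject matches whose motion vector is too long.
        return np.linalg.norm(pts2 - pts1, axis=1) < motion_thresh

    # Moving camera: build a fundamental matrix from the IMU-predicted motion
    # and keep only matches that approximately satisfy the epipolar constraint.
    E = skew(t) @ R
    F = np.linalg.inv(K).T @ E @ np.linalg.inv(K)
    ones = np.ones((len(pts1), 1))
    x1 = np.hstack([pts1, ones])
    x2 = np.hstack([pts2, ones])
    residual = np.abs(np.sum(x2 * (x1 @ F.T), axis=1))  # |x2^T F x1| per match
    return residual < epi_thresh

In practice a normalized residual such as the Sampson distance, rather than the raw algebraic value used above, would make the epipolar threshold easier to set.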

Description

Technical field

[0001] The invention discloses a FAST feature homogenization extraction and IMU-based inter-frame feature mismatch removal method, which belongs to the technical field of computer vision and image processing.

Background technique

[0002] A feature extraction method selects easy-to-observe parts of an image as features. Among these, corner features are more recognizable and simpler to compute. Existing corner feature extraction methods are described below.

[0003] For each pixel in the image, the Harris method calculates the grayscale change in the pixel's neighborhood along the horizontal and vertical axes; if the grayscale change is obvious in both directions, the pixel is judged to be a corner point. In 1994, Shi and Tomasi proposed an improved method based on Harris, the GFT (Good Features to Track) method, which tightens the corner judgment condition: the grayscale change must be greater than a set threshold in both directions.
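To make the background concrete, the snippet below calls OpenCV's off-the-shelf implementations of the two detectors just described; the input file name and all parameter values are placeholders chosen for illustration, not part of the patent.

# Illustrative use of the corner detectors named in the background (OpenCV).
import cv2
import numpy as np

gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)  # placeholder input image

# Harris: corner response from grayscale changes in each pixel's neighborhood.
harris = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
harris_corners = np.argwhere(harris > 0.01 * harris.max())

# Shi-Tomasi / GFT: keep corners whose corner quality exceeds a threshold.
gftt_corners = cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                       qualityLevel=0.01, minDistance=10)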

Claims


Application Information

IPC(8): G06K9/46, G06K9/62
CPC: G06V10/443, G06V10/513, G06F18/22
Inventor: 齐志, 周珊珊, 刘昊, 时龙兴
Owner: SOUTHEAST UNIV