
Underwater vision SLAM (Simultaneous Localization and Mapping) method based on camera motion constraint and distortion correction

A camera motion constraint and distortion correction technology, applied to computer components, image analysis, image enhancement, etc. It addresses problems such as unprocessed feature points, feature point mismatching, and feature matching errors, thereby improving image quality, preventing feature point mismatches, and eliminating underwater cumulative error.

Pending Publication Date: 2022-07-29
YANGZHOU UNIV

AI Technical Summary

Problems solved by technology

[0005] (1) When a visual real-time scene construction system is used underwater, if the camera motion is not otherwise constrained, environmental factors such as bubbles are generated in the water, degrading the quality of the captured images and thus the quality of the SLAM map;
[0006] (2) To improve shooting quality in the underwater environment, the existing approach is to pre-calibrate the camera in air and then calibrate it underwater. However, this approach does not process the feature points directly, which leads to feature point mismatching, reduces the true-positive rate of loop closure detection, and degrades the overall mapping.
[0007] In view of the above problems, in 2017 Yang Pingle and others from the Industrial Technology Research Institute of Jiangsu University of Science and Technology proposed an image processing method for an underwater visual SLAM system (grant publication number: CN104574387B). By establishing an underwater imaging model and handling the influence of underwater environmental factors on camera imaging, it addresses mismatching and matching efficiency in data association, and copes better with feature matching problems caused by light refraction, scattering, and absorption. However, it does not consider image distortion or constraints on camera motion, which may cause feature matching errors and degrade the mapping result.
In 2019, Wu Xiaojun and others from the Shenzhen Graduate School of Harbin Institute of Technology proposed a vision-based underwater measurement method (publication number: CN105698767B), which uses pre-calibration in air combined with underwater camera calibration to achieve accurate underwater measurement with a camera above the water surface. However, it does not address the distortion of feature points when the SLAM camera operates underwater.
In 2021, Zhang Fei and others from Jiangsu University of Science and Technology proposed a method for underwater autonomous navigation and positioning (grant publication number: CN106403953B), which models the SLAM problem with a random finite set approach. It describes map feature information, map feature observation information, clutter, and other factors more accurately, improves the accuracy of traditional methods in estimating the number and location of map features, and speeds up camera pose estimation and computation. However, it neglects the significant influence of the complex underwater environment on camera motion and does not consider the distortion of the image and its feature points that may arise underwater.




Embodiment Construction

[0039] As shown in Figure 1, an underwater vision SLAM method based on constrained camera motion and distortion correction includes the following steps:

[0040] Step 1) Sensor information acquisition;

[0041] A camera captures visible light from the electromagnetic spectrum as a picture and represents it as a digital image, i.e., a two-dimensional matrix of finite numerical values called pixels. Sensor information reading in SLAM mainly consists of reading and preprocessing the camera image information.
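As an illustration of this step only (the patent does not prescribe a specific pipeline or library), the following is a minimal sketch of reading and preprocessing one camera frame, assuming OpenCV is used and that preprocessing consists of grayscale conversion plus histogram equalization; the function name and parameters are hypothetical.

```python
import cv2

def read_and_preprocess(frame_path: str):
    """Read one camera frame and return a preprocessed grayscale image.

    Assumed preprocessing: grayscale conversion + histogram equalization,
    which can help in low-contrast underwater scenes. This is a sketch,
    not the patent's specified preprocessing.
    """
    image = cv2.imread(frame_path)                    # pixel matrix (BGR)
    if image is None:
        raise FileNotFoundError(frame_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)    # 2-D intensity matrix
    return cv2.equalizeHist(gray)                     # stretch the intensity range
```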

[0042] Step 2) Front-end visual odometry based on the feature point method of the monocular camera;

[0043] Front-end visual odometry based on feature points is the mainstream approach to visual odometry. It runs stably and is insensitive to illumination changes and dynamic objects, making it a relatively mature solution at present. Its work is mainly divided into two steps:

[0044] First, feature points are extracted from the image, and the feature ...
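As a hedged illustration of this front end (the patent text is truncated above), the sketch below extracts ORB feature points from two adjacent frames and matches their descriptors with a brute-force Hamming matcher and a ratio test, assuming OpenCV; it is a common feature-point front end, not necessarily the patent's exact pipeline.

```python
import cv2

def match_adjacent_frames(gray_prev, gray_curr, n_features: int = 1000):
    """Extract ORB feature points in two adjacent frames and match them.

    Sketch only: ORB + brute-force Hamming matching with Lowe's ratio test
    is a standard choice; the patent does not fix these exact components.
    """
    orb = cv2.ORB_create(nfeatures=n_features)
    kp1, des1 = orb.detectAndCompute(gray_prev, None)
    kp2, des2 = orb.detectAndCompute(gray_curr, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    knn = matcher.knnMatch(des1, des2, k=2)

    # Keep only unambiguous matches (ratio test)
    good = [pair[0] for pair in knn
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    return kp1, kp2, good
```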



Abstract

The invention discloses an underwater vision SLAM (Simultaneous Localization and Mapping) method based on constrained camera motion and distortion correction, comprising the following steps: an underwater speed optimization equation is created from images of adjacent frames and a constraint on the overall camera motion path, and the motion rate of the underwater camera is optimized both locally and globally; this solves the bubble problem caused by excessive deviation of the underwater camera's motion rate, ensures the quality of the captured images, and facilitates SLAM mapping. When the bag of words is created, underwater distortion correction is applied to the captured images and their feature points by fusing radial distortion, tangential distortion, and a scattering coefficient, which effectively solves the problem of feature point mismatching during loop closure detection, eliminates underwater accumulated errors, and improves the true loop rate of loop closure detection.
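To illustrate the distortion-correction idea (the patent's exact formulation is not fully visible here), the sketch below undistorts a single feature point using the standard radial and tangential (Brown-Conrady) model, with an assumed scattering coefficient `beta` folded into the radial term as one plausible way to "fuse" the three factors; all parameter names are hypothetical.

```python
import numpy as np

def undistort_feature_point(u, v, K, k1, k2, p1, p2, beta, iterations=5):
    """Correct one pixel feature point (u, v) for radial and tangential
    distortion plus an assumed scattering term.

    K: 3x3 intrinsic matrix; k1, k2: radial coefficients; p1, p2: tangential
    coefficients; beta: scattering coefficient, here simply scaling the radial
    term (an illustrative assumption, not the patent's stated fusion).
    """
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]

    # Normalized, distorted coordinates
    xd = (u - cx) / fx
    yd = (v - cy) / fy

    # Fixed-point iteration to invert the distortion model
    x, y = xd, yd
    for _ in range(iterations):
        r2 = x * x + y * y
        radial = 1.0 + (1.0 + beta) * (k1 * r2 + k2 * r2 * r2)
        dx = 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
        dy = p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
        x = (xd - dx) / radial
        y = (yd - dy) / radial

    # Back to pixel coordinates
    return fx * x + cx, fy * y + cy
```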

Description

Technical Field
[0001] The invention relates to the field of simultaneous localization and mapping, and in particular to an underwater vision SLAM method based on constrained camera motion and distortion correction.
Background
[0002] In recent years, as robotics has attracted more attention, the key technologies of SLAM have developed rapidly, yet some problems remain unsolved.
[0003] At present, existing visual SLAM methods can be divided into two categories: direct methods and feature point methods. The direct methods are represented by sparse direct visual odometry (Direct Sparse Odometry, "DSO"), and the feature point methods are represented by a monocular SLAM system based on sparse feature points (Oriented FAST and Rotated BRIEF SLAM, "ORB-SLAM"). Although DSO and ORB-SLAM achieve good results on public land datasets, they are not ideal when tested in underwater environme...


Application Information

IPC (8): G06T7/73; G06T5/00; G06V10/46; G06V20/05
CPC: G06T7/73; G06V10/464; G06V20/05; G06T2207/30252; G06T5/80
Inventors: 孙进, 周威, 谢文涛, 汪和平, 马昊天, 雷震霆, 梁立
Owner: YANGZHOU UNIV