
A Pose and Trajectory Estimation Method Based on Image Interpolation

A pose estimation and image-processing technology, applied in the field of computer vision, which addresses the problem of low recognition accuracy and achieves the effects of improving recognition accuracy, reducing tracking loss, and improving the accuracy of pose and trajectory estimation.

Active Publication Date: 2022-08-05
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0003] In order to overcome the deficiencies of the above-mentioned prior art, the present invention provides a pose and trajectory estimation method based on image interpolation, which solves the prior-art problem that the recognition accuracy of visual odometry estimating pose and trajectory from captured images is not high.



Embodiment Construction

[0026] Specific implementations of the present invention are described in detail below, examples of which are illustrated in the accompanying drawings, wherein the same or similar reference numerals represent the same or similar elements or elements with the same or similar functions throughout. The embodiments described below with reference to the accompanying drawings are exemplary, and are intended to explain the present invention and should not be construed as limiting the present invention.

[0027] 1. Image frame interpolation

[0028] When capturing images, it is sometimes unavoidable that the camera moves too fast or its frame rate is too low, leaving too little common field of view between two adjacent frames. Video frame interpolation technology can effectively alleviate this problem.

[0029] As a preferred solution, a robust video frame interpolation method is used that utilizes deep convolutional neural networks to achieve frame interpolation without explicitly splitti...
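To make the idea concrete, below is a minimal sketch, assuming a PyTorch environment, of a network that takes two adjacent frames and directly synthesizes the intermediate frame without an explicit motion-estimation step. The tiny architecture and the name TinyInterpNet are illustrative assumptions, not the network described in the patent.

```python
import torch
import torch.nn as nn

class TinyInterpNet(nn.Module):
    """Hypothetical stand-in for a learned frame-interpolation network."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, kernel_size=3, padding=1), nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, frame0, frame1):
        # Stack the two adjacent frames along the channel axis and predict
        # the intermediate frame directly, with no explicit optical-flow step.
        return self.net(torch.cat([frame0, frame1], dim=1))

if __name__ == "__main__":
    f0 = torch.rand(1, 3, 120, 160)   # previous frame (B, C, H, W)
    f1 = torch.rand(1, 3, 120, 160)   # next frame
    mid = TinyInterpNet()(f0, f1)     # synthesized middle frame
    print(mid.shape)                  # torch.Size([1, 3, 120, 160])
```

A practical interpolation network would be far deeper and trained on large video datasets; the point here is only the input/output contract: two adjacent frames in, one synthesized intermediate frame out, which is then treated like an ordinary frame by the rest of the pipeline.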


Abstract

The invention provides a pose and trajectory estimation method based on image frame interpolation. A new frame is interpolated between two image frames, semantics are used as a representation of the invariant scene, and the semantic information jointly constrains the pose and trajectory estimation together with feature points. By increasing the number of feature-point matches between frames, tracking loss is reduced; by fusing semantic information, the influence of feature points on dynamic objects is suppressed and feature-point matching is constrained, so the accuracy of both pose estimation and trajectory estimation is improved. Experiments on public datasets show that the method maintains high accuracy, is robust to moving objects and sparse textures, and achieves good results in improving the recognition accuracy of visual odometry.
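As an illustration of how semantic information can suppress feature points on dynamic objects, the sketch below uses OpenCV ORB features together with a label map from any off-the-shelf segmentation network, and keeps only keypoints that fall on static scene regions. The class ids, the helper name static_keypoints, and the all-static dummy label map are assumptions for illustration, not the patent's implementation.

```python
import cv2
import numpy as np

# Hypothetical label ids for dynamic classes (e.g. person, car); the real
# ids depend on the segmentation model that produces the label map.
DYNAMIC_CLASS_IDS = {11, 13}

def static_keypoints(image, semantic_labels, n_features=2000):
    """Detect ORB keypoints and keep only those on static scene regions.

    semantic_labels: H x W integer label map from any segmentation model.
    """
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) if image.ndim == 3 else image
    orb = cv2.ORB_create(n_features)
    keypoints, descriptors = orb.detectAndCompute(gray, None)
    if descriptors is None:
        return [], None
    kept_kp, kept_des = [], []
    for kp, des in zip(keypoints, descriptors):
        x, y = int(round(kp.pt[0])), int(round(kp.pt[1]))
        # Drop keypoints lying on classes treated as dynamic objects.
        if semantic_labels[y, x] not in DYNAMIC_CLASS_IDS:
            kept_kp.append(kp)
            kept_des.append(des)
    return kept_kp, np.array(kept_des)

if __name__ == "__main__":
    img = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
    labels = np.zeros((480, 640), dtype=np.int32)   # all-static dummy map
    kps, des = static_keypoints(img, labels)
    print(len(kps), "keypoints kept on static regions")
```

The filtered keypoints and descriptors can then be matched between the original frames and the interpolated frame, increasing the number of usable correspondences without letting moving objects distort the estimate.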

Description

Technical Field

[0001] The invention relates to the technical field of computer vision, and in particular to a pose and trajectory estimation method based on image frame interpolation.

Background Technique

[0002] The goal of visual odometry is to estimate the motion of the camera from the captured images. Two approaches are commonly used: the feature point method and the direct method. The feature point method is currently the mainstream and achieves good results when the camera moves quickly, illumination changes are not pronounced, and the environment is varied; the direct method does not require feature extraction but is not suitable for environments with fast camera motion. At the heart of visual odometry is the problem of data association, since it establishes pixel-level correspondences between images. These corresponding pixels are used to construct a 3D map of the scene and to track the current camera pose. This loca...
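For readers unfamiliar with the feature-point pipeline mentioned above, the following is a minimal two-view sketch using standard OpenCV routines: matched pixel coordinates provide the data association, the essential matrix is estimated with RANSAC, and the relative camera pose is recovered from it. This is textbook two-view geometry, not the patented method, and the function name two_view_pose is an assumption for illustration.

```python
import cv2

def two_view_pose(pts0, pts1, K):
    """Recover the relative rotation R and unit-scale translation t between
    two frames from matched pixel coordinates.

    pts0, pts1: N x 2 float arrays of matched pixel coordinates.
    K: 3 x 3 camera intrinsic matrix.
    """
    # Robustly estimate the essential matrix from the correspondences.
    E, inlier_mask = cv2.findEssentialMat(pts0, pts1, K,
                                          method=cv2.RANSAC, threshold=1.0)
    # recoverPose selects the physically valid decomposition of E
    # (points must lie in front of both cameras).
    _, R, t, _ = cv2.recoverPose(E, pts0, pts1, K, mask=inlier_mask)
    return R, t, inlier_mask
```

Downstream, R and t are chained frame to frame (including across interpolated frames) to build the estimated trajectory; with a monocular camera the essential matrix gives only the direction of translation, so absolute scale must come from elsewhere.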

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC(8): G06V10/75, G06V10/80, G06V10/82, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/08, G06V10/751, G06N3/045, G06F18/25
Inventors: 梁志伟 (Liang Zhiwei), 郭强 (Guo Qiang), 周鼎宇 (Zhou Dingyu)
Owner: NANJING UNIV OF POSTS & TELECOMM