
Self-adaptive indoor vision positioning method based on global motion estimation

A global motion estimation and visual positioning technology in the field of image processing. It addresses the problems of heavy computation, slow parameter estimation, and the resulting difficulty in positioning, and achieves improved feature-extraction effectiveness, higher positioning accuracy, and shorter execution time.

Publication status: Inactive
Publication date: 2014-09-03
Applicant: BEIJING UNIV OF TECH
Cites: 2 | Cited by: 28

AI Technical Summary

Problems solved by technology

In recent years, the main problem with many global motion estimation methods has been their heavy computational load, which makes parameter estimation slow, limits their application, and makes it difficult to meet the demands of positioning. Improving real-time performance has therefore become the primary issue.




Embodiment Construction

[0035] The present invention will be further described below in conjunction with the accompanying drawings and embodiments.

[0036] The wired camera used in the experiment captures images at a resolution of 640×480 pixels (96 dpi) with a video frame rate of 30 fps.

[0037] The flowchart of the method of the present invention is shown in Figure 1; the specific implementation process is as follows:

[0038] (1) The camera collects ground image information, and two images are taken from the sequence: the zeroth frame serves as the reference frame, and the other image serves as the current frame. Each image is preprocessed with an adaptive smoothing filter and then converted from a color image to a grayscale image.
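The preprocessing in step (1) can be sketched in NumPy. This is a minimal illustration, not the patent's actual implementation: the grayscale conversion uses standard ITU-R BT.601 luminance weights, and the adaptive smoothing is one common variant in which neighborhood weights fall off with local gradient magnitude so edges are preserved; the smoothing constant `k` is an assumed parameter.

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image to grayscale (ITU-R BT.601 weights)."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def adaptive_smooth(img: np.ndarray, k: float = 10.0) -> np.ndarray:
    """One iteration of adaptive smoothing over a 3x3 neighborhood.

    Each pixel becomes a weighted average of its neighbors; the weights
    shrink where the gradient magnitude is large, so edges are kept sharp
    while flat regions are smoothed.  `k` controls edge sensitivity
    (hypothetical default, not taken from the patent).
    """
    img = img.astype(float)
    gy, gx = np.gradient(img)
    w = np.exp(-(gx**2 + gy**2) / (2.0 * k**2))   # small weight at edges
    pad_i = np.pad(img, 1, mode="edge")
    pad_w = np.pad(w, 1, mode="edge")
    h, wd = img.shape
    num = np.zeros((h, wd))
    den = np.zeros((h, wd))
    for dy in (0, 1, 2):                           # accumulate 3x3 window
        for dx in (0, 1, 2):
            num += pad_i[dy:dy + h, dx:dx + wd] * pad_w[dy:dy + h, dx:dx + wd]
            den += pad_w[dy:dy + h, dx:dx + wd]
    return num / den
```

In the method's pipeline, the smoothed grayscale image would then be passed to the MIC-based corner detection described in the abstract.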

[0039] (2) Before the Kalman filtering algorithm can accurately predict the overlapping region, M matchings are required. If the two images have been successfully matched fewer than M times, perform step (3) on the original images; if more than M time...
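The Kalman prediction of the overlapping region in step (2) can be illustrated with a small constant-velocity filter over the inter-frame offset. This is a hedged sketch under assumed modeling choices (state `[dx, dy, vx, vy]`, noise levels `q` and `r`), not the patent's parameterization: after M full-image matches supply measured offsets, `predict()` gives the expected offset of the next frame pair, from which the overlap window can be cut out.

```python
import numpy as np

class OffsetKalman:
    """Constant-velocity Kalman filter over the inter-frame offset (dx, dy).

    State: [dx, dy, vx, vy].  The predicted offset bounds the overlapping
    region of the next image pair, so feature detection and matching can be
    restricted to that region.  q and r are assumed noise levels.
    """

    def __init__(self, q: float = 1e-2, r: float = 1.0):
        self.x = np.zeros(4)                              # state estimate
        self.P = np.eye(4)                                # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0                 # dx += vx, dy += vy
        self.H = np.eye(2, 4)                             # observe (dx, dy)
        self.Q = q * np.eye(4)                            # process noise
        self.R = r * np.eye(2)                            # measurement noise

    def predict(self) -> np.ndarray:
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                                 # predicted (dx, dy)

    def update(self, z) -> None:
        z = np.asarray(z, dtype=float)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
```

After each successful match, the measured offset is fed to `update()`; once the filter has converged, `predict()` localizes the overlap so only that region needs corner detection.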



Abstract

The invention relates to a self-adaptive indoor vision positioning method based on global motion estimation. The method comprises the following steps: firstly, extracting and matching feature points based on a modified MIC (Minimum Intensity Change) algorithm and a modified SURF (Speeded-Up Robust Features) algorithm, and estimating the overlapping region of each pair of matched images by applying Kalman filtering; secondly, detecting and matching feature points only within the overlapping region, and computing the offsets of two consecutive images in the sequence with a global motion estimation method; and lastly, estimating the displacement of the camera with a six-parameter affine model according to the matching result, drawing the real trace of the camera on the interface of an upper computer in real time, and correcting the drawn trace according to preset wireless beacons. In this method, corner points are detected, described, and matched on the overlapping region rather than on the whole image, and the modified MIC algorithm is used for detection, so the effectiveness of feature-point extraction, the estimation accuracy of the model, and the execution speed are all improved. Wireless beacons are adopted to correct the position information of the camera, which improves the positioning accuracy.
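The six-parameter affine step in the abstract can be sketched as an ordinary least-squares fit over matched point pairs. This is a minimal NumPy illustration under assumed conventions (row vectors `[x, y]`, model `x' = a·x + b·y + tx`, `y' = c·x + d·y + ty`), not the patent's estimator; in the actual method the point pairs would come from the MIC/SURF matching on the overlap region.

```python
import numpy as np

def fit_affine(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Least-squares fit of the six-parameter affine model
        x' = a*x + b*y + tx,   y' = c*x + d*y + ty
    from matched point pairs (src -> dst), each an N x 2 array with N >= 3.
    Returns the 3x3 homogeneous transform matrix."""
    n = src.shape[0]
    A = np.hstack([src, np.ones((n, 1))])              # N x 3: [x, y, 1]
    params, *_ = np.linalg.lstsq(A, dst, rcond=None)   # 3 x 2 solution
    return np.vstack([params.T, [0.0, 0.0, 1.0]])      # [[a,b,tx],[c,d,ty],[0,0,1]]

def camera_translation(M: np.ndarray):
    """The translation component (tx, ty) approximates the camera
    displacement between the two frames; accumulating it over the sequence
    yields the trace drawn on the upper computer's interface."""
    return float(M[0, 2]), float(M[1, 2])
```

Accumulated translations give the raw trajectory; the wireless-beacon correction mentioned in the abstract would then reset the drift at known positions.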

Description

Technical Field

[0001] The invention belongs to the field of image processing. It relates to a method for realizing indoor visual positioning using computer technology, digital image processing technology, and optical technology. Specifically, it relates to an adaptive indoor visual positioning method based on global motion estimation, which judges the displacement magnitude and direction of the camera by automatically analyzing the video images captured by the camera.

Background Technique

[0002] Indoor spaces are where human activity is most concentrated, so research on indoor location-based services is of great significance. Due to multipath and non-line-of-sight effects in the complex and changeable indoor environment, GPS signals are very weak or even unreceivable, so GPS and other widely used outdoor positioning technologies cannot be applied indoors. Vision-based indoor positioning technology has become the focus of indoor positioning technology research i...

Claims


Application Information

IPC(8): H04N17/00; G06T7/20; G01C11/00
Inventors: 张会清, 张敬丽
Owner: BEIJING UNIV OF TECH