
Indoor scene positioning method based on hybrid camera

An indoor scene localization technology applied in the field of hybrid-camera-based indoor scene localization. It addresses problems such as the growing difficulty of computing similarity between key frames and the restricted scalability of existing methods, and achieves improved positioning accuracy and efficiency.

Active Publication Date: 2015-04-15
ZHEJIANG UNIV
Cites: 3 | Cited by: 15

AI Technical Summary

Problems solved by technology

The drawback is that each key frame is typically encoded with operations such as downsampling and brightness normalization; as the number of key frames increases, computing the similarity between key frames becomes more and more expensive.
Moreover, for such scene recognition algorithms based on global image matching, the recognition results can only depend on the limited number of camera poses stored in the database, which severely restricts the scalability of this type of method.
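To make the keyframe-matching bottleneck concrete, the sketch below shows a generic global-matching relocalizer of the kind described above: frames are encoded by downsampling and brightness normalization, and a query is compared against every stored key frame. The encoding resolution, distance metric, and OpenCV usage are illustrative assumptions, not the patent's method.

```python
import numpy as np
import cv2  # assumed available; any image library would do

def encode_keyframe(rgb, size=(40, 30)):
    """Encode a frame the way global-matching relocalizers typically do:
    downsample heavily and normalize brightness so only coarse structure remains."""
    gray = cv2.cvtColor(rgb, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, size, interpolation=cv2.INTER_AREA).astype(np.float32)
    return (small - small.mean()) / (small.std() + 1e-6)

def most_similar_keyframe(query_code, keyframe_codes):
    """Brute-force similarity search: the cost grows linearly with the number of
    stored key frames, which is the scalability problem noted above."""
    dists = [np.sum((query_code - c) ** 2) for c in keyframe_codes]
    return int(np.argmin(dists)), min(dists)
```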




Detailed Description of the Embodiments

[0027] In order to describe the present invention more specifically, the technical solutions of the present invention will be described in detail below in conjunction with the accompanying drawings and specific embodiments.

[0028] As shown in Figure 1, the hybrid-camera-based indoor scene positioning method of the present invention comprises the following steps:

[0029] (1) Capture the indoor scene with an RGB-D hybrid camera to obtain one RGB image sequence and one depth image sequence;

[0030] (2) Extract the depth information of each pixel in every frame of the depth image sequence, generate a 3D point cloud of the indoor scene in real time, and compute the hybrid camera parameters in real time;
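A minimal sketch of the depth-to-point-cloud conversion in step (2), assuming a standard pinhole camera model with intrinsics fx, fy, cx, cy and a millimetre depth scale; the patent's actual camera parameters and any filtering it applies are not specified here.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy, depth_scale=0.001):
    """Back-project a depth image (H x W, raw sensor units) into an
    N x 3 array of 3D points in the camera frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float32) * depth_scale   # metres
    valid = z > 0                                # drop pixels with missing depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)
```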

[0031] In this embodiment, on the basis of estimating the camera pose with the traditional ICP (Iterative Closest Point) algorithm, the ICP algorithm is optimized, mainly through camera pose motion compensation and weighted ICP point cloud registration. The ICP...
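The paragraph above mentions weighted ICP point cloud registration without giving details; the following is a generic weighted point-to-point ICP iteration (nearest-neighbour correspondences plus a weighted Kabsch alignment), included only to illustrate where per-point weights enter the registration. It is not the patent's exact formulation, and the weighting scheme is left to the caller.

```python
import numpy as np
from scipy.spatial import cKDTree

def weighted_icp_step(src, dst, weights):
    """One iteration of weighted point-to-point ICP.
    src (N x 3), dst (M x 3): point clouds; weights (N,): per-source-point
    confidence, e.g. lower for far or noisy depth. Returns rotation R and
    translation t that move src towards dst."""
    # 1. nearest-neighbour correspondences
    idx = cKDTree(dst).query(src)[1]
    matched = dst[idx]
    # 2. weighted centroids
    w = weights / weights.sum()
    src_c = (w[:, None] * src).sum(axis=0)
    dst_c = (w[:, None] * matched).sum(axis=0)
    # 3. weighted Kabsch: SVD of the weighted cross-covariance matrix
    H = (src - src_c).T @ (w[:, None] * (matched - dst_c))
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # reflection fix
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t
```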



Abstract

The invention discloses an indoor scene positioning method based on a hybrid camera. The method comprises: capturing a depth image and a color image of an indoor scene with the hybrid camera; tracking the position of the camera; training on the depth and color images of the indoor scene with a standard greedy forest-training algorithm to build a regression forest; and, when positioning within the indoor scene, using the depth image and color image of the current frame captured by the hybrid camera together with the trained regression forest to calculate the world coordinates corresponding to the current camera, thereby completing the positioning. Because the method captures the scene with a hybrid camera and trains a regression forest over every pixel of the known scene, neither sparse nor dense feature points are needed for scene positioning. This saves the time spent detecting, describing and matching feature points, makes camera positioning independent of the tracking state, avoids accumulated errors in camera tracking, and effectively improves the accuracy and efficiency of scene positioning.
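As a rough stand-in for the regression forest described in the abstract, the sketch below uses scikit-learn's RandomForestRegressor to map simple per-pixel features to 3D world coordinates. The feature choice, forest parameters, and placeholder data are assumptions; the patent's own pixel features, greedy tree construction, and pose solver are not reproduced here.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: one feature vector per sampled pixel (e.g. R, G, B,
# depth, or the patent's own offset features) and the known 3D world coordinate
# of that pixel, obtained from the tracked camera poses.
X_train = np.random.rand(5000, 4)   # placeholder features
y_train = np.random.rand(5000, 3)   # placeholder world coordinates (x, y, z)

forest = RandomForestRegressor(n_estimators=5, max_depth=16)
forest.fit(X_train, y_train)

# At localization time, predict a world coordinate for each sampled pixel of the
# current frame; paired with the pixels' camera-space points, these predictions
# let a robust solver (e.g. RANSAC plus a rigid alignment) recover the camera pose.
X_query = np.random.rand(200, 4)
predicted_world = forest.predict(X_query)   # 200 x 3 world-coordinate estimates
```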

Description

Technical Field

[0001] The invention belongs to the technical field of indoor scene positioning, and in particular relates to an indoor scene positioning method based on a hybrid camera.

Background Art

[0002] With the rapid growth of digital image data, automatic understanding of images by computers is becoming increasingly urgent. As an important research topic within image understanding, scene localization has received extensive attention and is currently a research hotspot in computer vision. Most current mainstream scene localization methods are based on color images.

[0003] Scene localization methods based on color images can be roughly divided into two categories. The first is scene localization based on sparse feature point matching. The strategy adopted by this type of algorithm is to extract a certain number of key points from the scene image while the camera is tracking normally, describe the features of these ke...
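For contrast with the forest-based approach of the invention, here is a minimal sketch of the first category of prior methods mentioned in [0003]: sparse keypoint detection, description, and matching between a query image and a stored scene image. The choice of ORB features and brute-force Hamming matching is illustrative only.

```python
import cv2

def match_sparse_features(img_query, img_scene, max_matches=100):
    """Detect, describe and match sparse keypoints between two color images:
    the per-frame cost that the patented method seeks to avoid."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(img_query, None)
    kp2, des2 = orb.detectAndCompute(img_scene, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
    return kp1, kp2, matches[:max_matches]
```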


Application Information

IPC(8): G06T7/00
CPC: G06T2207/30244
Inventor: 李阳, 王梁昊, 李东晓, 张明
Owner: ZHEJIANG UNIV