
Dual-camera video fusion distortion correction and viewpoint micro-adjustment method and system thereof

A technology of distortion correction and video fusion, applied in the field of image processing, which addresses problems such as the mismatch between the viewpoint and field of view of the human eye and those of the head-mounted device's cameras, and the resulting inability to achieve virtual-real fusion, achieving the effect of virtual-real consistency

Active Publication Date: 2017-11-24
ZHONGKE HENGYUN CO LTD

AI Technical Summary

Problems solved by technology

At present, there are several problems: first, the angle of view seen by the human eye and the angle of view of the head-mounted device are different, and the widths of their fields of view also differ; second, the angle of view and field of view captured by a single camera differ from those of the human eye and also vary with the type of head-mounted device.
As a result, the captured view can never be matched to the real scene, which prevents MR device users from relying on visual, intuitive judgment to operate real equipment and defeats the purpose of virtual-real fusion.



Examples


Embodiment Construction

[0044] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate the same or similar elements or elements having the same or similar functions throughout. The embodiments described below by referring to the figures are exemplary and are intended to explain the present invention and should not be construed as limiting the present invention.

[0045] As shown in figure 1 and figure 2, the dual-camera video fusion distortion correction and viewpoint fine-tuning method in the embodiment of the present invention includes the following steps:

[0046] Step S1, making an equal-scale model and marking reference points on it in a single high-contrast color.

[0047] Step S2, installing two environment cameras above the head-mounted device, capturing images with the environment cameras, and performing image fusion calibration.

[0048] Specifically, each environmental camera ...
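The patent text is cut off before it details the image fusion calibration of Step S2, so the following is only a minimal sketch of one conventional way such a calibration could be performed, assuming an OpenCV feature-matching pipeline between the two environment cameras' overlapping views; the function names, thresholds and blending weights are illustrative assumptions and are not taken from the patent.

```python
import cv2
import numpy as np

def estimate_fusion_homography(img_left, img_right, min_matches=10):
    """Estimate a homography mapping the right environment-camera image
    onto the left camera's image plane (one possible fusion calibration)."""
    orb = cv2.ORB_create(2000)
    kp_l, des_l = orb.detectAndCompute(img_left, None)
    kp_r, des_r = orb.detectAndCompute(img_right, None)

    # Brute-force Hamming matching with cross-checking for ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_r, des_l), key=lambda m: m.distance)
    if len(matches) < min_matches:
        raise RuntimeError("not enough feature matches for calibration")

    src = np.float32([kp_r[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_l[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # RANSAC rejects mismatched points before fitting the homography.
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def fuse_frames(img_left, img_right, H):
    """Warp the right frame into the left frame's coordinates and blend."""
    h, w = img_left.shape[:2]
    warped = cv2.warpPerspective(img_right, H, (w, h))
    return cv2.addWeighted(img_left, 0.5, warped, 0.5, 0)
```

Once estimated on calibration images of the marked model, such a mapping could be reused for every subsequent frame pair, which is consistent with calibrating the fusion parameters once rather than per frame.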



Abstract

The invention provides a dual-camera video fusion distortion correction and viewpoint micro-adjustment method and a system thereof. The method comprises the following steps: making an equal-scale model and marking reference points on it in a single high-contrast color; installing two environment cameras above a head-mounted device, capturing images with the environment cameras and carrying out image fusion calibration; using a measuring tool to calibrate the position of a VR helmet relative to the environment cameras in physical space, and calculating the positions of the environment cameras in the world coordinate system from their offset distance to the head-mounted device viewpoint; calibrating video splicing parameters and viewpoint micro-adjustment parameters; and performing these operations on each pair of frames shot simultaneously by the two environment cameras. The invention solves the display deviation of real equipment in the head-mounted device under different viewpoint micro-deviation conditions and achieves virtual-real consistency.
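The abstract's world-coordinate step (computing each environment camera's position from its measured offset to the head-mounted viewpoint) amounts to applying a rigid transform. The snippet below is a minimal sketch under the assumption that the viewpoint pose is available as a 4x4 homogeneous matrix; the function name and the example offset are purely illustrative and not from the patent.

```python
import numpy as np

def camera_position_world(viewpoint_pose, camera_offset):
    """Return an environment camera's position in world coordinates, given
    the head-mounted viewpoint pose (4x4 world transform) and the camera's
    measured offset in the viewpoint's local frame (metres)."""
    offset_h = np.append(camera_offset, 1.0)      # homogeneous coordinates
    return (viewpoint_pose @ offset_h)[:3]

# Example: viewpoint at the world origin, camera mounted 5 cm above and
# 3 cm in front of the viewpoint (illustrative numbers only).
pose = np.eye(4)
print(camera_position_world(pose, np.array([0.0, 0.05, -0.03])))
```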

Description

Technical field

[0001] The present invention relates to the technical field of image processing, and in particular to a dual-camera video fusion distortion correction and viewpoint fine-adjustment method and system.

Background technique

[0002] In mixed reality (MR) technology, how to seamlessly integrate the scene seen in the actual environment with the virtual scene while conforming to the perspective, field of view and usage habits of the head-mounted device is a new technical difficulty. At present, there are several problems: first, the angle of view seen by the human eye and the angle of view of the head-mounted device are different, and the widths of their fields of view also differ; second, the angle of view and field of view captured by a single camera differ from those of the human eye and also vary with the type of head-mounted device.

[0003] Generally, it is sufficient to use a wide-angle camera to obtain the field of view of the real scene in the left ...


Application Information

IPC(8): G06T3/40; G06T7/80; G06F3/01; H04N5/225; H04N5/232; H04N5/265
CPC: H04N5/265; G06F3/012; G06T3/4038; G06T7/85; H04N23/45; H04N23/57; H04N23/951
Inventor: 钟秋发, 锡泊, 黄煦, 高晓光, 李晓阳
Owner: ZHONGKE HENGYUN CO LTD