
A method and a device for performing 3D fusion display on a real scene and a virtual object

A technology relating to real scenes and virtual objects, applied to the input/output processes of data processing, user/computer interaction input/output, image data processing, and the like, to achieve a good fusion effect, strong realism, and a good user experience

Status: Inactive
Publication Date: 2019-04-09
SUPERD CO LTD

AI Technical Summary

Problems solved by technology

[0003] Most current technologies take a pair of images captured by a monocular camera, or two images captured by a binocular camera, and add virtual special effects to the acquired images for virtual-real fusion. However, the final synthesized result is merely a 2D image of the virtual object added onto the real scene, and does not present a three-dimensionally fused display to the observer
[0004] Another fusion display technique places pre-trained two-dimensional markers in the real scene to be previewed, and draws the virtual object at the image position corresponding to the markers by detecting their posture. This method has great limitations, because for many applications with high real-time requirements, the lengthy marker training cannot be performed in advance



Examples


Embodiment 1

[0061] Figure 1 and Figure 2 show the flow chart of a method for 3D fusion display of a real scene and a virtual object provided by Embodiment 1 of the present invention. As shown in Figure 1, the method includes the following steps:

[0062] In step A, two 2D left-eye and right-eye images of the real scene, collected in real time by a binocular camera, are acquired.

[0063] The fusion display method in this embodiment can be applied to a terminal equipped with a binocular camera, for example a smart phone, where the binocular camera includes a left camera and a right camera. When shooting is required, the binocular camera collects real scene information in real time, and the information collected in real time includes the left-eye image frame captured by the left camera and the right-eye image frame captured by the right camera.

[0064] For each image frame of the left-...
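As a rough illustration of step A, the sketch below grabs a synchronized left/right frame pair with OpenCV. The device indices and the one-capture-per-camera layout are assumptions; a real binocular rig may expose both views through a vendor SDK or a single interleaved stream.

```python
import cv2

# Assumed device indices for the two cameras of the binocular rig.
left_cam = cv2.VideoCapture(0)   # left camera
right_cam = cv2.VideoCapture(1)  # right camera

def capture_stereo_pair():
    """Return one (left, right) pair of 2D image frames captured in real time."""
    ok_left, left_frame = left_cam.read()
    ok_right, right_frame = right_cam.read()
    if not (ok_left and ok_right):
        raise RuntimeError("failed to read a frame from one of the cameras")
    return left_frame, right_frame
```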

Embodiment 2

[0108] Figure 3 and Figure 4 are schematic diagrams of a device for 3D fusion display of a real scene and a virtual object provided in Embodiment 2 of the present invention, and the device includes:

[0109] The obtaining module 31 is configured to obtain two 2D left-eye and right-eye image frames of the real scene collected by the binocular camera in real time;

[0110] The determining module 32 is configured to determine the first target mark position of the three-dimensional virtual object on the image frame, and obtain the depth value d of the first target mark position;

[0111] The calculation module 33 is configured to determine, according to the depth value d of the first target mark position, the world space vectors of all target mark positions of the three-dimensional virtual object on the image frame;

[0112] The transformation module 34 is configured to transform, according to the internal reference matrix M_in of the binocular camera, the world space vectors of all target marker positions to obtain a target ...
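The geometry behind modules 32-34 can be sketched as follows, assuming a standard pinhole camera model. The symbols and function names here (fx, fy, cx, cy, M_ex, M_proj, back_project, to_clip_space) are illustrative assumptions, not identifiers from the patent.

```python
import numpy as np

def back_project(u, v, d, M_in):
    """Modules 32/33: lift a pixel (u, v) with depth value d to a world-space
    point, taking the camera frame as the world frame for simplicity."""
    fx, fy = M_in[0, 0], M_in[1, 1]   # focal lengths from the intrinsic matrix
    cx, cy = M_in[0, 2], M_in[1, 2]   # principal point
    return np.array([(u - cx) * d / fx, (v - cy) * d / fy, d])

def to_clip_space(p_world, M_ex, M_proj):
    """Module 34: map a world-space target mark position to a clip-space
    position vector via extrinsic and projection matrices."""
    p = np.append(p_world, 1.0)       # homogeneous coordinates
    return M_proj @ (M_ex @ p)        # clip-space vector (x, y, z, w)
```

This matches the flow stated in the abstract: a single depth value d at the first target mark position determines the world space vectors of all target mark positions, which are then transformed into clip space for each eye.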

Embodiment 3

[0134] An embodiment of the present invention also provides an electronic device, including at least one processor; and,

[0135] a memory communicatively coupled to the at least one processor; wherein,

[0136] The memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor, so that the at least one processor can execute the method described in Embodiment 1 above.

[0137] For the specific execution process of the processor, reference may be made to the description of Embodiment 1 of the present invention, and details are not repeated here.



Abstract

The invention discloses a method and a device for 3D fusion display of a real scene and a virtual object. The method comprises the steps of: obtaining two 2D left-eye and right-eye images of a real scene collected by a binocular camera in real time; determining a first target mark position of the three-dimensional virtual object on the image frame, and obtaining a depth value of the first target mark position; determining world space vectors of all the target mark positions of the three-dimensional virtual object on the image frame according to the depth value of the first target mark position; transforming the world space vectors of all the target mark positions according to the internal parameter matrix and the external parameter matrix of the binocular camera to obtain a target mark clip-space position vector corresponding to each target mark position; obtaining two 2D left-eye and right-eye virtual-real fusion images according to the two 2D left-eye and right-eye images and the target mark clip-space position vector corresponding to each target mark position; and performing three-dimensional display according to the two 2D left-eye and right-eye virtual-real fusion images.
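Taken end to end, the abstract describes a project-and-composite pipeline: lift target marks to world space, project them into each eye's clip space, draw the virtual object into both 2D views, and hand the pair to a 3D display. The sketch below strings these steps together with a trivial draw-a-dot stand-in for real 3D rendering; the OpenGL-style clip-space convention and every name here are assumptions for illustration only.

```python
import numpy as np
import cv2

def clip_to_pixel(clip, width, height):
    """Perspective divide plus viewport transform (OpenGL-style NDC assumed)."""
    ndc = clip[:3] / clip[3]
    u = int((ndc[0] * 0.5 + 0.5) * width)
    v = int((1.0 - (ndc[1] * 0.5 + 0.5)) * height)
    return u, v

def fuse_view(view, clip_positions):
    """Stand-in for rendering the virtual object: mark each projected target
    mark position on one 2D eye image, yielding a virtual-real fusion image."""
    fused = view.copy()
    height, width = fused.shape[:2]
    for clip in clip_positions:
        cv2.circle(fused, clip_to_pixel(clip, width, height), 5, (0, 255, 0), -1)
    return fused

# Usage (inputs assumed): one fused image per eye, each computed with that
# eye's own clip-space positions, then both are handed to the 3D display.
#   fused_left  = fuse_view(left_image,  clip_positions_left)
#   fused_right = fuse_view(right_image, clip_positions_right)
```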

Description

technical field

[0001] The present invention relates to Augmented Reality (AR) and 3D display technology, and in particular to a method and a device for 3D fusion display of a real scene and a virtual object.

Background technique

[0002] In the field of augmented reality, how to fuse the real scene picture previewed by the camera with a virtual object more realistically, so that the observer feels the virtual object is truly embedded in the real scene, has long been a research problem in the field.

[0003] Most current technologies take a pair of images captured by a monocular camera, or two images captured by a binocular camera, and add virtual special effects to the acquired images for virtual-real fusion. However, the final synthesized result is merely a 2D image of the virtual object added onto the real scene, and cannot present a three-dimensionally fused display to the observer.

[0004] Another fusion display tec...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/00, G06F3/01
CPC: G06T19/006, G06F3/011
Inventor: 范福鼎, 李晓鸣
Owner: SUPERD CO LTD