
Model system and method for fusion of virtual scene and real scene

A technology for fusing real scenes with virtual scenes, applied in the computer field. It addresses problems such as overlapping object display and degraded immersion caused by fusing too many low-impact real-scene objects, and achieves strong compatibility, an improved immersion effect, and better coordination between real-scene and virtual-scene objects.

Publication Date: 2017-01-25 (Inactive)
Applicant: 深圳前海小橙网科技有限公司
Cites: 6 | Cited by: 31

AI Technical Summary

Problems solved by technology

[0010] First, the existing technology fuses real-scene objects with virtual-scene objects only according to the spatial position relationship between objects, on the basis of a consistent imaging angle of view. However, the degree to which real-scene objects and factors influence the user's perception does not always depend on their spatial position. Some objects in the real scene sit at the center of the user's field of view yet have no obvious perceptual impact on a user immersed in virtual reality, while other objects and factors in the real environment occupy a secondary spatial position under the user's perspective, or are not visible at all (such as the temperature and humidity of the real environment, or the weather and lighting mentioned above), yet exert a strong influence on the user's perception. Therefore, from the standpoint of achieving immersion, when integrating the real scene into the virtual scene, priority should be given to the real-scene objects and factors that affect user perception most, while the priority and weight of objects and factors with relatively little perceptual impact should be reduced.
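For illustration only, here is a minimal sketch in Python of the weighting principle this paragraph describes. The data structure, field names, and the 0.7/0.3 weighting formula are all assumptions made for the example, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class RealSceneItem:
    """A real-scene object or environmental factor (hypothetical structure)."""
    name: str
    angular_offset_deg: float   # offset from the center of the user's view
    visible: bool               # False for factors such as temperature or humidity
    perceptual_salience: float  # 0..1: how strongly the item registers on the senses

def influence_weight(item: RealSceneItem) -> float:
    """Illustrative weighting: perceptual salience dominates, and spatial
    centrality contributes only for visible objects, so an invisible but
    strongly felt factor (e.g. room temperature) can outrank an object
    sitting at the center of the user's field of view."""
    centrality = max(0.0, 1.0 - item.angular_offset_deg / 90.0) if item.visible else 0.0
    return 0.7 * item.perceptual_salience + 0.3 * centrality

items = [
    RealSceneItem("wall poster", 5.0, True, 0.1),        # centered but ignorable
    RealSceneItem("room temperature", 0.0, False, 0.9),  # invisible but strongly felt
]
for it in sorted(items, key=influence_weight, reverse=True):
    print(f"{it.name}: weight = {influence_weight(it):.2f}")
```

Under this toy formula the invisible temperature factor (0.63) ranks above the centered poster (0.35), matching the paragraph's point that spatial position alone is a poor proxy for perceptual impact.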
[0011] Secondly, in virtual reality games, movies, and various simulation trainings, the objects and content of the virtual scene serve mainly to present the storyline and guide user interaction, and integrating real-scene content indiscriminately can destroy the immersion effect. Therefore, when the real scene contains many or complex objects and factors, fusing too many of them weakens the user's focus on the virtual-scene objects; an excess of real-scene objects also occupies the display space of the virtual scene and may even cause the displayed objects to overlap.
[0012] Third, whether real-scene images are superimposed directly or the objects in the real scene are modeled and re-imaged, the objects taken from the real scene and the original objects of the virtual scene are likely to end up visually incompatible in the final displayed image. For example, if the virtual scene shows the furnishings of an ancient castle room while the real-scene objects to be integrated are modern items such as mobile phones, the original immersion effect will inevitably be seriously damaged.
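Taken together, the three problems motivate a selection step that ranks items by influence weight, caps how many are fused, and flags incompatible survivors for restyling rather than fusing them as captured. A minimal sketch under those assumptions (the threshold, cap, and labels are illustrative, not the patent's concrete method):

```python
def select_and_style(weights, threshold=0.5, max_objects=3, incompatible=frozenset()):
    """Rank real-scene items by influence weight, keep only those above the
    threshold, cap their number so they do not crowd the virtual scene's
    display space, and mark visually incompatible survivors for restyling
    of their imaging model instead of fusing them as captured."""
    ranked = sorted(weights, key=weights.get, reverse=True)
    kept = [name for name in ranked if weights[name] >= threshold][:max_objects]
    return [(name, "restyle" if name in incompatible else "as captured") for name in kept]

weights = {"mobile phone": 0.8, "chair": 0.6, "wall clock": 0.3, "temperature": 0.9}
print(select_and_style(weights, incompatible={"mobile phone"}))
# -> [('temperature', 'as captured'), ('mobile phone', 'restyle'), ('chair', 'as captured')]
```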




Embodiment Construction

[0051] The technical solutions of the present invention are further described below through specific examples.

[0052] Referring to figure 1, the present invention provides a model system for the fusion of a virtual scene and a real scene. The specific structure and function of the system are introduced below.

[0053] The real-scene shooting unit 1001 is configured to shoot a real-scene image at a shooting angle close to the user's point of view and to provide parameters representing the shooting angle of that image. The real-scene shooting unit 1001 may include at least one camera arranged near the user's actual position (such as the user's seat) in the virtual reality picture display system (which may be a 2D or 3D display system), the camera being set to shoot at an angle of view matching the user's perspective. The user perspective mentioned here refers to the range of vision of the user's eyes when observing the virtua...
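As a rough illustration of what unit 1001's output could look like, here is a sketch assuming a simple pose-plus-field-of-view parameterization. Every field name is hypothetical, since the text only says the unit provides parameters representing the shooting angle:

```python
from dataclasses import dataclass

@dataclass
class ShotParameters:
    """Hypothetical shooting-angle parameters attached to each frame from unit 1001."""
    position_m: tuple  # (x, y, z) camera position near the user's seat, in meters
    yaw_deg: float     # horizontal shooting direction
    pitch_deg: float   # vertical shooting direction
    fov_deg: float     # field of view approximating the user's range of vision

@dataclass
class RealSceneFrame:
    image: bytes            # raw frame data from the camera
    params: ShotParameters  # passed downstream so fusion can match the user's perspective

frame = RealSceneFrame(
    image=b"\x00" * 16,  # stand-in for real pixel data
    params=ShotParameters(position_m=(0.0, 1.2, 0.0), yaw_deg=0.0, pitch_deg=-5.0, fov_deg=100.0),
)
print(frame.params)
```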



Abstract

The invention provides a model system and method for fusing a virtual scene and a real scene. Images of the user's real scene are shot, real-scene objects are extracted under the user's perspective, and various environmental parameters of the real scene are comprehensively collected. The influence weights of all real-scene objects and real-environment factors on the user's perception while the user is immersed in virtual reality are then judged. According to the ranking of influence weights, the real-scene objects and real-environment factors with higher influence weights are selected for fusion with the virtual scene, while objects and factors whose influence weights fall below a threshold are ignored. Taking both the influence weight and the spatial position of each real-scene object into account, the real-scene objects and virtual-scene objects are harmonized into a common display. Finally, the compatibility level between each real-scene object and the virtual scene is determined, and the imaging model of the real-scene object is adjusted accordingly.
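Read as a pipeline, the abstract's stages can be strung together as in the following sketch. Every key name, value, and the "restyled"/"as captured" marking are assumptions standing in for stages the abstract only names:

```python
def fuse_scenes(real_items, virtual_scene, threshold=0.5):
    """Apply the abstract's stages in order: rank real-scene items by their
    already-judged influence weight, ignore those below the threshold, merge
    the rest into the virtual scene, and restyle any item whose compatibility
    check failed. `real_items` maps name -> {"weight": float, "compatible": bool}."""
    ranked = sorted(real_items.items(), key=lambda kv: kv[1]["weight"], reverse=True)
    for name, info in ranked:
        if info["weight"] < threshold:
            continue  # low-impact objects and factors are ignored
        virtual_scene[name] = "restyled" if not info["compatible"] else "as captured"
    return virtual_scene

scene = {"castle wall": "virtual"}
real = {
    "temperature":  {"weight": 0.9, "compatible": True},
    "mobile phone": {"weight": 0.8, "compatible": False},  # clashes with a castle interior
    "wall clock":   {"weight": 0.3, "compatible": True},   # below threshold, ignored
}
print(fuse_scenes(real, scene))
# -> {'castle wall': 'virtual', 'temperature': 'as captured', 'mobile phone': 'restyled'}
```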

Description

Technical field

[0001] The invention belongs to the technical field of computers, and in particular relates to a model system and method for fusing virtual scenes and real scenes.

Background technique

[0002] Virtual reality technology, highly popular in recent years, is based on the comprehensive application of computer 3D graphics technology, sensing technology, human-computer interaction technology, and 3D display technology. It presents users with highly realistic 3D visual perception and realizes high real-time interaction between the displayed 3D world and the user's real-world behavior, giving the user an experience infinitely close to the real world. The ability to produce immersive effects is an important feature that distinguishes virtual reality technology from other graphic display and human-computer interaction technologies. In the environment created by virtual reality, it is difficult to consciously and subconsciously divide the virtual reality worl...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F3/01; G06T19/00
CPC: G06F3/011; G06F2203/012; G06T19/006; G06T2200/08
Inventor: 左曾旭衡
Owner: 深圳前海小橙网科技有限公司