
Auxiliary modeling method and system, intelligent wearable device and VR device

An auxiliary modeling method and system for smart devices, applied in the field of VR, addressing the problems that VR applications lack interaction with the user and fail to provide a strong sense of immersion.

Pending Publication Date: 2021-08-24
DONGGUAN ELF EDUCATIONAL SOFTWARE CO LTD


Problems solved by technology

[0004] In actual use, VR applications often lack interaction with the user. Users typically experience content by manipulating VR wearable devices, which fails to provide a strong sense of immersion.



Examples


Embodiment 1

[0068] In one embodiment of the present invention, as shown in Figure 1, an auxiliary modeling method is provided, comprising the steps of:

[0069] S1. Acquire the first image information of the preset space environment sent by the VR device.

[0070] S2. Generate a three-dimensional environment model according to the first image information.

[0071] Specifically, when a user runs a VR application, the VR device captures first image information of a preset space environment, and a corresponding three-dimensional environment model is generated by analyzing this image information. In this embodiment, the first image information of the preset space environment is collected by a TOF camera; in other embodiments, similar camera devices may be used instead.
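The patent does not disclose how the TOF depth data is turned into a three-dimensional environment model. A common approach is to back-project the depth map into a 3D point cloud using a pinhole camera model; the sketch below assumes hypothetical camera intrinsics `fx`, `fy`, `cx`, `cy` (not values from the patent) and uses NumPy.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a TOF depth map (in meters) into a 3D point cloud
    using a pinhole camera model with intrinsics fx, fy, cx, cy."""
    h, w = depth.shape
    # Pixel coordinate grids: u runs along columns, v along rows.
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # One (x, y, z) point per pixel.
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Toy example: a flat surface 2 m away filling a 4x4 depth map.
cloud = depth_to_point_cloud(np.full((4, 4), 2.0),
                             fx=2.0, fy=2.0, cx=2.0, cy=2.0)
```

A real pipeline would then fuse such point clouds across frames (e.g. via registration and surface reconstruction) to produce the environment model; that stage is outside what the text describes.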

[0072] S3. Obtain the second image information of the wearer's body and the surrounding environment sent by the wearable smart device.

[0073] In addition, this solution adds a wearabl...

Embodiment 2

[0077] In one embodiment of the present invention, as shown in Figure 2, an auxiliary modeling method is provided, comprising the steps of:

[0078] S1. Acquire the first image information of the preset space environment sent by the VR device.

[0079] S2. Generate a three-dimensional environment model according to the first image information.

[0080] Specifically, when a user runs a VR application, the VR device captures first image information of a preset space environment, and a corresponding three-dimensional environment model is generated by analyzing this image information. In this embodiment, the first image information of the preset space environment is collected by a TOF camera; in other embodiments, similar camera devices may be used instead.

[0081] Preferably, generating a three-dimensional environment model according to the first image information specifically includes the steps of:

[0082] S21. Obtain the three-dimensio...

Embodiment 3

[0090] In one embodiment of the present invention, as shown in Figure 3, an auxiliary modeling method is provided, comprising the steps of:

[0091] S1. Acquire the first image information of the preset space environment sent by the VR device.

[0092] S2. Generate a three-dimensional environment model according to the first image information.

[0093] Specifically, when a user runs a VR application, the VR device captures first image information of a preset space environment, and a corresponding three-dimensional environment model is generated by analyzing this image information. In this embodiment, the first image information of the preset space environment is collected by a TOF camera; in other embodiments, similar camera devices may be used instead.

[0094] S3. Obtain the second image information of the wearer's body and the surrounding environment sent by the wearable smart device.

[0095] In addition, this solution adds a wearable...



Abstract

The invention provides an auxiliary modeling method and system, an intelligent wearable device, and a VR device. The method comprises the following steps: acquiring first image information of a preset space environment sent by the VR device; generating a three-dimensional environment model according to the first image information; acquiring second image information, sent by the intelligent wearable device, of the wearer's limbs and surrounding environment; and judging the wearer's behavior according to the second image information, and controlling a 3D modeled doll to perform VR interaction in the three-dimensional environment model. This scheme enables following interaction between a real person and a 3D virtual image, improving the user's sense of immersion when using the VR device.
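The four steps in the abstract form a simple acquire-model-judge-control loop. The sketch below is a minimal, hypothetical rendering of that flow; the callable names (`build_env_model`, `detect_behavior`, `drive_avatar`) are illustrative placeholders, not APIs disclosed by the patent.

```python
def auxiliary_modeling(vr_frame, wearable_frame,
                       build_env_model, detect_behavior, drive_avatar):
    """One pass of the four-step flow described in the abstract."""
    env_model = build_env_model(vr_frame)       # S1+S2: first image -> 3D environment model
    behavior = detect_behavior(wearable_frame)  # S3+S4: second image -> wearer behavior
    drive_avatar(env_model, behavior)           # move the 3D doll inside the model
    return env_model, behavior

# Usage with stub callables, just to show the data flow:
model, act = auxiliary_modeling(
    "frame_a", "frame_b",
    build_env_model=lambda f: {"model_from": f},
    detect_behavior=lambda f: {"behavior_from": f},
    drive_avatar=lambda m, b: None,
)
```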

Description

Technical Field

[0001] The invention relates to the technical field of VR, and in particular to an auxiliary modeling method and system, an intelligent wearable device, and a VR device.

Background

[0002] VR technology integrates computer graphics, computer simulation, sensor technology, network parallel processing, and other technologies to provide an immersive experience in an interactive three-dimensional environment generated on a computer; this three-dimensional environment is called a virtual environment. Devices based on VR technology (such as VR wearable devices) can provide various VR applications and output a virtual environment to the user for an immersive experience.

[0003] TOF (Time Of Flight) 3D imaging technology emits continuous infrared light pulses of a specific wavelength toward a target in space; a sensor then receives the light signal reflected back from the object to be me...
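The TOF principle in [0003] reduces to one relation: distance is the speed of light times the round-trip time, halved because the pulse travels out and back. A minimal sketch:

```python
# Speed of light in vacuum, m/s.
C = 299_792_458

def tof_distance(round_trip_seconds):
    """Distance to the target from a TOF round-trip time.
    The pulse travels to the object and back, so the one-way
    distance is half the round-trip path length."""
    return C * round_trip_seconds / 2

# A round trip of roughly 13.34 ns corresponds to about 2 m.
d = tof_distance(13.34e-9)
```

Practical TOF sensors typically measure phase shift of a modulated signal rather than a raw pulse time, but the distance relation is the same.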

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T17/00; G06F3/01
CPC: G06T17/00; G06F3/011
Inventor: 林泽填
Owner: DONGGUAN ELF EDUCATIONAL SOFTWARE CO LTD