
Method for grabbing object through man-machine cooperation in VR scene

A virtual-reality interaction technology that addresses the problems of 3D virtual objects being too far away or occluded, which prevent users from operating them accurately and efficiently.

Active Publication Date: 2020-02-07
UNIV OF JINAN

AI Technical Summary

Problems solved by technology

However, when people actually interact using virtual reality technology, 3D virtual objects are often too far away or occluded, so users cannot accurately and efficiently operate the virtual objects they want to interact with.



Embodiment Construction

[0051] The present invention will be described in further detail below in conjunction with the accompanying drawings:

[0052] The present invention uses Intel RealSense to estimate gestures and drive a virtual human hand, and uses the iFLYTEK Speech Recognition SDK to recognize the user's voice input, i.e. the audio signal; the recognized speech is then grammatically segmented to obtain the user's operation intention. Situational awareness enables the scene to change actively to assist the user in completing the operation. The steps of the method of the present invention are shown in Figure 1, and include:
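The grammatical segmentation step described above can be sketched as a minimal intent parser. This is an illustrative assumption, not the patent's algorithm: the vocabulary, function name `parse_intent`, and output structure are all hypothetical, and the real system uses the iFLYTEK SDK's recognized text as input.

```python
# Hypothetical sketch of grammar-based intent segmentation: the recognized
# utterance is split into an action verb and a target phrase. The verb
# vocabulary below is illustrative, not taken from the patent.

ACTION_VERBS = {"grab", "take", "pick", "move", "release"}

def parse_intent(utterance: str) -> dict:
    """Segment a recognized utterance into an operation intention."""
    words = utterance.lower().split()
    action = next((w for w in words if w in ACTION_VERBS), None)
    if action is None:
        return {"action": None, "target": None}
    # Everything after the verb, minus articles, is treated as the target.
    tail = words[words.index(action) + 1:]
    target = " ".join(w for w in tail if w not in {"the", "a", "an"})
    return {"action": action, "target": target or None}

print(parse_intent("Grab the red cube"))
# {'action': 'grab', 'target': 'red cube'}
```

A parse like `{'action': 'grab', 'target': 'red cube'}` is what the scene-transformation stage would consume as the user's operation intention.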

[0053] Gesture recognition and speech recognition are performed at the same time. Gesture recognition collects the video signal of the human hand, drives the movement of the virtual human hand in the virtual scene according to the video signal, and obtains the gesture recognition result, as shown in Figure 1. Speech recognition collects the user's audio signal and identi...
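Running the two recognition channels "at the same time" can be sketched with two worker threads feeding a shared queue. This is a minimal sketch under stated assumptions: `recognize_gesture` and `recognize_speech` are placeholders standing in for the RealSense-driven hand tracking and the iFLYTEK-based speech recognition, not real SDK calls.

```python
# Minimal concurrency sketch: gesture and speech recognition run in
# parallel and each posts its result to a shared queue for later fusion.
import queue
import threading

results: "queue.Queue[tuple[str, str]]" = queue.Queue()

def recognize_gesture() -> str:
    return "pinch"            # placeholder for video-driven hand tracking

def recognize_speech() -> str:
    return "grab the cube"    # placeholder for audio recognition

def worker(channel: str, recognize) -> None:
    results.put((channel, recognize()))

threads = [
    threading.Thread(target=worker, args=("gesture", recognize_gesture)),
    threading.Thread(target=worker, args=("speech", recognize_speech)),
]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Drain both results; insertion order does not matter for the fused dict.
fused = dict(results.get() for _ in range(2))
print(fused)  # {'gesture': 'pinch', 'speech': 'grab the cube'}
```

In a real system each worker would loop over frames and audio chunks; the point here is only that both channels produce results independently before fusion.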



Abstract

The invention provides a method for grabbing an object through man-machine cooperation in a VR scene, and belongs to the technical field of virtual reality. Gesture recognition and voice recognition are carried out at the same time. Gesture recognition comprises collecting video signals of a human hand, driving the virtual human hand in a virtual scene to move according to the video signals, and obtaining a gesture recognition result. Voice recognition comprises collecting an audio signal of the user, performing recognition and semantic analysis on the audio signal, and obtaining a voice recognition result from the semantic analysis. The scene is then actively converted according to the gesture recognition result and the voice recognition result to assist the user in completing the operation. By utilizing the method, the user can be assisted to accurately and efficiently operate the virtual object with which the user wants to interact.
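One way to picture the "scene active conversion" idea is the scene translating a distant target object toward the virtual hand once the fused recognition results name it. The geometry below is an illustrative assumption, not the patent's actual transformation; `move_toward` and the `reach` parameter are hypothetical.

```python
# Illustrative sketch: if the named target is beyond the hand's reach, the
# scene moves the object along the line toward the hand until it is
# graspable, instead of forcing the user to stretch. Assumed geometry only.
import math

def move_toward(obj, hand, reach=0.5):
    """Translate obj along the line to hand until it is within reach."""
    dx, dy, dz = (h - o for o, h in zip(obj, hand))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist <= reach:
        return obj                    # already graspable, no change needed
    t = (dist - reach) / dist         # fraction of the distance to travel
    return tuple(o + t * d for o, d in zip(obj, (dx, dy, dz)))

# Object 3 m in front of the hand is brought to the reach boundary.
print(move_toward((0.0, 0.0, 3.0), (0.0, 0.0, 0.0)))
# -> approximately (0.0, 0.0, 0.5)
```

After the move, the object sits at the reach distance from the hand, so the ordinary grab gesture can complete the interaction.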

Description

Technical field

[0001] The invention belongs to the field of virtual reality technology, and specifically relates to a method for human-machine collaborative grabbing of objects in a VR scene.

Background technique

[0002] Virtual reality (VR) technology is a product of today's technological development; combined with a variety of human-computer interaction interface technologies, it can provide users with multi-channel input and perception interfaces. However, when people actually interact using virtual reality technology, 3D virtual objects are often too far away or occluded, so users cannot accurately and efficiently operate the virtual objects they want to interact with.

Summary of the invention

[0003] The purpose of the present invention is to solve the above-mentioned problems in the prior art and to provide a method for human-machine collaborative grabbing of objects in a VR scene, which can assist users to accurately and efficiently operate the virtual ob...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T19/00, G06F3/01, G10L15/18, G10L15/22
CPC: G06T19/006, G06F3/017, G10L15/1822, G10L15/22, G10L2015/223, G10L2015/225, Y02D10/00
Inventor 冯志全 (Feng Zhiquan), 李健 (Li Jian)
Owner UNIV OF JINAN