
Character action recognition method based on VR practical training equipment

A technology for character action recognition, applied in data-processing input/output, user/computer-interaction input/output, instruments, and similar fields. Its effect is a simple, versatile, and streamlined judgment method.

Active Publication Date: 2019-08-27
Applicant: 郑州爱普锐科技有限公司
Cites: 6, Cited by: 0


Problems solved by technology

During use, because the engine continuously switches screens and scenes, the engine's coordinate system is constantly changing. Current VR devices judge actions from the coordinate points of the device handle, which requires recording and evaluating a large number of handle coordinate points to determine the user's actions. This approach involves a huge amount of computation and complex algorithms, which degrades the overall fluency of the system.
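The coordinate-point approach criticized above can be sketched as follows. The template-matching scheme, function names, and distance threshold are illustrative assumptions rather than anything stated in the patent; the point is that every judgment compares all recorded handle samples against every gesture template.

```python
# Illustrative sketch (not from the patent) of naive coordinate-point
# action judgment: record every handle sample, then compare the whole
# trajectory against each gesture template. Each judgment costs
# O(samples x template points) per gesture, which is what the patent
# identifies as computationally heavy.

def trajectory_distance(samples, template):
    """Mean point-to-point distance between two equal-length 3-D paths."""
    assert len(samples) == len(template)
    total = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(samples, template):
        total += ((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2) ** 0.5
    return total / len(samples)

def classify(samples, templates, threshold=0.1):
    """Return the name of the closest gesture template, or None if none is close."""
    best = min(templates, key=lambda name: trajectory_distance(samples, templates[name]))
    return best if trajectory_distance(samples, templates[best]) < threshold else None
```

With many templates and a high sampling rate, this per-frame bookkeeping is exactly the cost the quadrant-separator method is designed to avoid.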


Figure 1: Character action recognition method based on VR practical training equipment


Specific Embodiments

[0011] The technical solutions of the present invention are described in further detail below through specific embodiments.

[0012] As shown in Figure 1, a method for character action recognition based on VR training equipment includes the following steps: S01, establish a plurality of virtual quadrant separators in the VR engine; S02, associate the established quadrant separators into a separator group whose state information and position information change synchronously; S03, keep the center of the separator group always coincident with the center of the helmet in the VR engine; S04, name each quadrant separator; S05, add a collider to each device in the VR engine that requires position or motion judgment; S06, judge and track motions by evaluating which quadrant separators the collider collides with, and in what order.
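Steps S01 to S06 can be sketched in Python as follows, using a 2-D top-down simplification. The quadrant names, the example "wave" gesture pattern, and the axis-aligned geometry are illustrative assumptions, not the patent's actual implementation; a real VR engine would use its physics colliders and trigger events instead of this point test.

```python
# Minimal sketch of the quadrant-separator method (steps S01-S06),
# under assumed 2-D geometry. Quadrant names and gesture patterns
# are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Vec2:
    x: float
    y: float

def quadrant_of(center: Vec2, p: Vec2) -> str:
    """S01/S04: return the named quadrant separator that p falls in,
    relative to the group's center (helmet position)."""
    dx, dy = p.x - center.x, p.y - center.y
    if dy >= 0:
        return "front-right" if dx >= 0 else "front-left"
    return "back-right" if dx >= 0 else "back-left"

class SeparatorGroup:
    """S02/S03: the quadrant separators move as one unit, kept
    centered on the helmet every frame."""
    def __init__(self):
        self.center = Vec2(0.0, 0.0)

    def follow_helmet(self, helmet_pos: Vec2):
        self.center = helmet_pos

class ActionTracker:
    """S05/S06: track which quadrants the handle collider enters,
    in order, and match the sequence against known action patterns."""
    def __init__(self, group, patterns):
        self.group = group
        self.patterns = patterns      # action name -> quadrant sequence
        self.sequence = []

    def update(self, handle_pos: Vec2):
        q = quadrant_of(self.group.center, handle_pos)
        if not self.sequence or self.sequence[-1] != q:
            self.sequence.append(q)   # record only quadrant *changes*
        for name, pat in self.patterns.items():
            if self.sequence[-len(pat):] == pat:
                return name           # the collision sequence matched an action
        return None
```

For example, with a hypothetical pattern `{"wave": ["front-left", "front-right", "front-left"]}`, sweeping the handle left, right, then left in front of the helmet would be recognized as a wave. Note that only quadrant *transitions* are recorded, which is why the bookkeeping stays small compared with per-frame coordinate logging.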

[0013] In practice, a VR device includes a helmet, a handle, and at least two locators. The locator is us...


Abstract

The invention provides a character action recognition method based on VR practical training equipment, comprising the following steps: S01, build a plurality of virtual quadrant separators in a VR engine; S02, associate the established quadrant separators to form a separator group whose state information and position information change synchronously; S03, keep the center of the separator group coincident with the center of the helmet in the VR engine at all times; S04, name each quadrant separator; S05, add a collider to each device in the VR engine that requires position or action judgment; S06, judge and track the action from the collisions between the collider and the quadrant separators and from their collision sequence. The character action recognition method based on VR practical training equipment has the advantages of high universality, a simple algorithm, accurate judgment, and easy adjustment and modification.

Description

Technical Field

[0001] The invention relates to a character action recognition method based on VR training equipment.

Background Technique

[0002] Virtual reality technology is a computer simulation system that can create, and let users experience, a virtual world. It uses a computer to generate a simulated environment and maps the actions of real people into a three-dimensional dynamic scene.

[0003] Existing VR devices have two locators. The locators define a movable range in the real world; this range is the coordinate system the VR device establishes in the real world. The VR program engine also has its own coordinate system, and the two coordinate systems are independent of each other. To map the player's position in the real world to the corresponding position in the engine world, coordinate conversion is required. As for the motion state: in the real world, the player only performs certain actions but does not perform real lon...
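The coordinate conversion mentioned in the background can be illustrated with a small sketch. The linear scale-and-offset model below is an assumption made for illustration; a real engine may additionally apply a rotation between the play-area axes and the engine axes.

```python
# Hedged sketch of converting a point from the real-world play-area
# coordinate system (defined by the two locators) into the engine's
# coordinate system. Assumes the two systems differ only by an
# origin offset and a uniform scale.

def real_to_engine(p, real_origin, engine_origin, scale):
    """Map a real-world point p = (x, y, z) into engine coordinates."""
    return tuple(eo + scale * (pc - ro)
                 for pc, ro, eo in zip(p, real_origin, engine_origin))
```

For instance, with the real origin at (0, 0, 0), the engine origin at (10, 0, 0), and a scale of 2, the real-world point (1, 2, 3) maps to (12, 4, 6). Doing such a conversion for every recorded handle sample is part of the overhead the background section describes.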


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01
CPC: G06F3/011
Inventors: 秦子函, 王浩奇, 秦世豪
Owner: 郑州爱普锐科技有限公司