
Motion control and animation generation method based on acceleration transducer

A motion control and animation generation method based on an acceleration sensor, applied in the field of computer virtual reality. It addresses the problem that existing approaches ignore the inherent physical meaning of the sensor data and therefore achieve a poor recognition rate, and achieves the effects of improving the motion recognition rate and enhancing the user experience.

Status: Inactive | Publication Date: 2011-02-16
BEIHANG UNIV

AI Technical Summary

Problems solved by technology

Although this method can effectively reduce the dimensionality of high-dimensional signal data, applying it to acceleration sensor data ignores the inherent physical meaning of the information the sensor provides, resulting in a poor recognition rate.

Method used

Examples

Embodiment Construction

[0042] The present invention is described in further detail below in conjunction with the accompanying drawings and embodiments:

[0043] The implementation of the present invention comprises four main steps: analysis of key joint points; feature extraction based on physical meaning; matching between low-dimensional acceleration sensor signals and high-dimensional skeletal motion data; and dynamic adjustment of the timing attributes of the resulting motion data, as shown in Figure 1.
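To make the data flow between these four steps concrete, here is a minimal structural sketch in Python. The function names and the toy implementations are hypothetical illustrations of how the stages could hand data to one another, not the patent's actual algorithms.

```python
import numpy as np

def select_key_joints(motion_clips, n_joints=3):
    # Step 1: analysis of key joint points, here approximated by ranking
    # joint channels by how much they vary over the motion database
    # (a PCA-based variant is sketched further below).
    stacked = np.vstack(motion_clips)                    # all frames x channels
    return np.argsort(stacked.var(axis=0))[-n_joints:]

def extract_features(signal):
    # Step 2: toy stand-in for physically meaningful per-channel features.
    return (signal ** 2).mean(axis=0)

def match_motion(sensor_features, clip_features):
    # Step 3: nearest-neighbour match between low-dimensional sensor features
    # and features precomputed from each skeletal motion clip.
    dists = [np.linalg.norm(f - sensor_features) for f in clip_features]
    return int(np.argmin(dists))

def retime(clip, target_frames):
    # Step 4: resample the recognized clip so its timing follows the
    # user's live performance.
    idx = np.linspace(0, len(clip) - 1, target_frames).astype(int)
    return clip[idx]
```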

[0044] Step 1, that is, the analysis of key joint points, is mainly divided into two stages:

[0045] The first stage: principal component analysis on the covariance matrix of raw motion data

[0046] PCA is a common method for dimensionality reduction, but the disadvantage of using PCA to reduce the dimensionality over the joints is that it disrupts the original joint semantics, making it difficult to select key joint points. To address this, we first use PCA to perform dimension reduction...
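As a rough sketch of this first stage (purely illustrative; it assumes the raw motion data is arranged as a frames-by-channels array with one column per joint degree of freedom, and none of the function or variable names come from the patent), PCA on the covariance matrix might look like this:

```python
import numpy as np

def pca_on_motion(motion, n_components=3):
    """PCA on the covariance matrix of raw motion data.

    motion: (n_frames, n_channels) array, e.g. one column per joint DOF.
    Returns the top principal directions and a per-channel loading score
    that can be mapped back to joints to judge which ones dominate.
    """
    centered = motion - motion.mean(axis=0)
    cov = np.cov(centered, rowvar=False)            # channels x channels
    eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1][:n_components]
    components = eigvecs[:, order]                  # channels x n_components
    # Contribution of each original channel to the retained components,
    # weighted by explained variance; large scores suggest candidate key joints.
    scores = (np.abs(components) * eigvals[order]).sum(axis=1)
    return components, scores

# Example: 200 frames of 10 synthetic joint channels.
motion = np.random.default_rng(0).normal(size=(200, 10))
_, scores = pca_on_motion(motion)
print(np.argsort(scores)[::-1][:3])   # indices of the most influential channels
```

Mapping the loading scores back to the original channels is one way of retaining a link to joint semantics while still exploiting the variance structure that PCA exposes.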

Abstract

The invention relates to a motion control and animation generation method based on an acceleration transducer. It belongs to the technical field of computer virtual reality, and in particular to motion control and animation generation in computer animation. The method comprises the following steps: first, the motion to be recognized is analyzed to obtain key joint point information for the motion process; then, features of the transducer data and of the motion data at the key joint positions are extracted on the basis of their physical meaning and used for subsequent motion classification and recognition. The signals are segmented and a motion recognition classifier is built from the feature sequences of the acceleration transducer data, and kinetic-energy-based features are used as the central features for matching against skeletal motion data. Finally, to enhance the user experience, the time sequence of the recognized motion result is adjusted so that it conforms to the timing of the user's online performance. The method can control the whole-body motion of a virtual human interactively and in real time using only a small number of transducers.
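As a rough illustration of what a kinetic-energy-based feature could look like for one segmented window of accelerometer samples, here is a minimal sketch. The exact formula, the unit mass, and the simple integration scheme are assumptions made for illustration and are not taken from the patent text.

```python
import numpy as np

def kinetic_energy_feature(accel, dt=0.01, mass=1.0):
    """Kinetic-energy-like feature for one window of 3-axis accelerometer
    samples (shape: n_samples x 3). Velocity is estimated by crude numerical
    integration of acceleration; the feature is the mean 0.5 * m * |v|^2
    over the window."""
    velocity = np.cumsum(accel, axis=0) * dt          # simple integration
    speed_sq = np.sum(velocity ** 2, axis=1)
    return 0.5 * mass * speed_sq.mean()

# Usage: one feature per segmented window; comparable features computed from
# the skeletal motion data at the key joints could then be matched against it,
# e.g. by nearest-neighbour search.
window = np.random.default_rng(1).normal(size=(100, 3))
print(kinetic_energy_feature(window))
```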

Description

Technical field

[0001] The invention belongs to the technical field of computer virtual reality, and in particular relates to a motion control and animation generation method in computer animation technology.

Background technique

[0002] With the continuous development of virtual reality technology, real-time virtual character motion control has been applied in more and more fields, such as animation, games, and virtual reality systems. For real-time virtual character control, a good user control interface plays a decisive role. Although a motion capture system can accurately capture the movements of performers and apply them to virtual character control, dynamic surface generation, and so on, it also has many disadvantages: the equipment is bulky and expensive; because the whole system must be set up in a fixed space, the types of motion that can be captured are limited; and the captured data lies in a high-dimensional space, a...

Claims

Application Information

IPC(8): G06T13/00
Inventor: 梁晓辉, 刘宇波, 何志莹, 岑丽霞, 刘杰
Owner: BEIHANG UNIV