
Method and system for extracting key frame from motion capture data and reconstructing motion

A key-frame extraction technology for motion capture data, applied in image data processing, animation production, instruments, etc.; it addresses the problems of huge data volume and heavy redundancy, which hinder the compression, storage, retrieval and further reuse of motion capture data.

Publication date: 2020-09-18 (status: Inactive)
Applicant: 北京中科深智科技有限公司

AI Technical Summary

Problems solved by technology

However, motion capture data is acquired at a relatively high sampling frequency, typically dozens or even hundreds of frames per second, and each frame contains the rotation information of a dozen or even dozens of joints. The resulting data volume is huge and highly redundant, which is not conducive to the compression, storage, retrieval and further reuse of motion capture data, so it is necessary to extract key frames that represent the content of the motion data.
Moreover, existing key-frame extraction methods have difficulty extracting key frames accurately, so the motion postures reconstructed from those key frames lack realism and fidelity and cannot be applied to human character animation production.




Detailed Description of Embodiments

[0080] The technical solutions of the present invention are further described below with reference to the accompanying drawings and specific embodiments.

[0081] The accompanying drawings are for illustration only; they are schematic diagrams rather than physical drawings and should not be construed as limiting this patent. To better illustrate the embodiments of the present invention, some parts of the drawings may be omitted, enlarged or reduced, and they do not represent the size of the actual product. Those skilled in the art will understand that certain known structures and their descriptions may be omitted from the drawings.

[0082] In the drawings of the embodiments of the present invention, the same or similar reference symbols denote the same or similar components; terms such as "inner" and "outer" that indicate orientations or positional relationships are based on the orientations or positional ...



Abstract

The invention discloses a method and a system for extracting key frames from motion capture data and reconstructing motion. The distance between quaternions is used to represent the difference between human body postures, and the total change over all joints of the body is used as the distance between frames. The first motion frame is taken as the first key frame; then, by continuous iteration, the difference between the current frame and the last key frame is calculated, frames whose difference is smaller than a threshold are discarded, and frames whose difference exceeds the threshold are extracted and stored as key frames. The extracted key-frame set is then used to reconstruct the motion by quaternion spherical interpolation, so that the original motion capture data is compressed at a relatively high ratio while the realism and fidelity of the restored animation are preserved.
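The extraction loop described above can be sketched as follows. This is a minimal illustration, assuming the capture data is an array of frames of per-joint unit quaternions and that the threshold is user-chosen; the function names and the NumPy representation are assumptions for illustration, not taken from the patent.

    import numpy as np

    def quat_distance(q1, q2):
        # Angle of the rotation separating two unit quaternions (w, x, y, z);
        # the absolute value of the dot product handles the q / -q ambiguity.
        dot = np.clip(abs(np.dot(q1, q2)), -1.0, 1.0)
        return 2.0 * np.arccos(dot)

    def frame_distance(frame_a, frame_b):
        # Total posture change: sum of quaternion distances over all joints.
        return sum(quat_distance(qa, qb) for qa, qb in zip(frame_a, frame_b))

    def extract_key_frames(frames, threshold):
        # frames: array of shape (num_frames, num_joints, 4).
        # The first frame is always a key frame; later frames are kept only
        # when they differ from the last kept key frame by more than threshold.
        key_indices = [0]
        last_key = frames[0]
        for i in range(1, len(frames)):
            if frame_distance(frames[i], last_key) > threshold:
                key_indices.append(i)
                last_key = frames[i]
        return key_indices

Frames whose distance to the last key frame stays below the threshold are simply dropped, which is what yields the compression ratio mentioned in the abstract.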

Description

Technical Field

[0001] The invention relates to the technical field of computer animation production, and in particular to a method and system for extracting key frames from motion capture data and reconstructing motion.

Background Art

[0002] In computer animation, human character animation is an important component, but because human motion has a high degree of freedom, creating a realistic and lifelike motion model is very difficult. At present, human character animation is produced mainly by kinematics-based animation or by animation based on motion capture data, of which the motion-capture-based approach is the most widely used. However, motion capture data is acquired at a relatively high sampling frequency, typically dozens or even hundreds of frames per second, and each frame contains the rotation information of a dozen or even dozens of joints, so the resulting data volume is huge and highly redundant; this is not conducive to the compression, storage, retrieval and further reuse of motion capture data, and it is therefore necessary to extract key frames that represent the content of the motion data.
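For the reconstruction side, the abstract names quaternion spherical interpolation (slerp) between key frames. The sketch below shows one common way to implement slerp and to regenerate the in-between frames for a single pair of key frames; the helper names and the choice of evenly spaced interpolation parameters are assumptions for illustration only, not details given in the patent.

    import numpy as np

    def slerp(q0, q1, t):
        # Spherical linear interpolation between unit quaternions q0 and q1.
        dot = np.dot(q0, q1)
        if dot < 0.0:                      # take the shorter arc
            q1, dot = -q1, -dot
        if dot > 0.9995:                   # nearly parallel: linear fallback
            q = q0 + t * (q1 - q0)
            return q / np.linalg.norm(q)
        theta = np.arccos(np.clip(dot, -1.0, 1.0))
        return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

    def reconstruct_between(key_a, key_b, num_inbetween):
        # Regenerate num_inbetween frames between two key frames by slerping
        # each joint's quaternion independently.
        frames = []
        for k in range(1, num_inbetween + 1):
            t = k / (num_inbetween + 1)
            frames.append(np.array([slerp(qa, qb, t) for qa, qb in zip(key_a, key_b)]))
        return frames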


Application Information

Patent type & authority: Application (China)
IPC (8): G06T13/40
CPC: G06T13/40
Inventor: Not disclosed (不公告发明人)
Owner: 北京中科深智科技有限公司