
Human movement expression method

A motion-analysis technology, applied in instrumentation, computing, image data processing, etc.; it addresses problems such as high data dimensionality, the large amount of training data required, and inaccurate parameter estimation

Inactive Publication Date: 2009-12-16
TSINGHUA UNIV

AI Technical Summary

Problems solved by technology

[0008] The main purpose of the present invention is to provide a method for representing human motion, in order to solve a current problem in the fields of computer vision and intelligent video content analysis: when a linear time series model is applied directly, the data dimensionality before analysis is particularly high, so more training data is required and the parameter estimates are not accurate enough.

Method used


Examples


Embodiment Construction

[0026] The specific embodiments of the present invention are described below:

[0027] Step 1: Representation of the pose space. The first approach is a contour-based representation of pose.

[0028] Contours are a good way to represent a person's posture: they are insensitive to changes in the person's appearance, such as the color and texture of clothing. Using M marker points P = {p1, p2, ..., pM} to describe a person's outer contour, each contour can be represented by a complex vector z = (x1 + j*y1, x2 + j*y2, ..., xM + j*yM)^T, where xi and yi denote the abscissa and ordinate of the i-th marker point pi. This pose representation must be invariant to position and isotropic scale changes, so z is normalized to z'. The real and imaginary parts of the complex vector z' constitute the representation of the contour.
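The contour encoding and normalization above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names and the M x 2 input layout are assumptions, and normalization here means subtracting the centroid (position invariance) and dividing by the Euclidean norm (isotropic scale invariance).

```python
import numpy as np

def contour_to_complex(points):
    """Stack M (x, y) marker points into a complex vector
    z = (x1 + j*y1, ..., xM + j*yM)^T."""
    points = np.asarray(points, dtype=float)
    return points[:, 0] + 1j * points[:, 1]

def normalize_contour(z):
    """Make the pose representation invariant to position and isotropic
    scale: subtract the centroid, then divide by the Euclidean norm."""
    z_centered = z - z.mean()
    return z_centered / np.linalg.norm(z_centered)

def pose_vector(points):
    """Real and imaginary parts of the normalized complex contour z',
    concatenated into one real feature vector of length 2M."""
    zp = normalize_contour(contour_to_complex(points))
    return np.concatenate([zp.real, zp.imag])
```

After normalization the centroid is zero and the norm is one, so two contours differing only by translation or uniform scaling map to the same vector.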

[0029] The following describes the nonlinear mapping of contours:

[0030] The Locally Linear Embedding algorithm (LLE)...
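As a rough sketch of the nonlinear dimensionality-reduction step, LLE can be applied to a matrix of contour vectors with an off-the-shelf implementation such as scikit-learn's. The random data, sizes (T = 100 contours of dimension 2M = 40), and parameter values below are illustrative assumptions, not values from the patent.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

# Hypothetical input: T contour vectors of dimension 2M, standing in
# for the normalized contour representations described above.
rng = np.random.default_rng(0)
contours = rng.standard_normal((100, 40))

# Map the high-dimensional contours into a low-dimensional embedded
# pose space via Locally Linear Embedding.
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=3)
low_dim = lle.fit_transform(contours)  # shape (100, 3)
```

Each row of `low_dim` is the low-dimensional pose corresponding to one contour; it is this reduced sequence that is later modeled with a linear time series model.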



Abstract

The invention discloses a representation method of human motion: the motion is mapped into a low-dimensional embedding space through nonlinear dimensionality reduction, and the reduced data are then modeled with a linear time series model. The mapping into the low-dimensional embedded pose space starts from the contour representation z = (x1 + j*y1, x2 + j*y2, ..., xM + j*yM)^T. The reduced data are modeled with a p-order autoregressive model AR(p), whose parameters are the coefficient matrices Ak in R^(m x m), a term v introduced so that the mean of the dynamic process can be non-zero, and the covariance matrix Q of the Gaussian white noise. Taking each Ak to be diagonal makes the components of z(t) independent. Given two autoregressive models A = [v, A1, A2, ..., Ap] and A' = [v', A1', A2', ..., Ap'], the distance metric is D(A, A') = ||A - A'||_F, where ||.||_F denotes the Frobenius norm of a matrix.
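The AR(p) modeling and the Frobenius-norm distance described in the abstract can be sketched as below. This is a minimal illustration under stated assumptions, not the patent's implementation: with diagonal Ak, each component of z(t) can be fitted independently by ordinary least squares, and only the diagonal entries need to be stored.

```python
import numpy as np

def fit_ar(z, p):
    """Least-squares fit of an AR(p) model
        z(t) = v + A1 z(t-1) + ... + Ap z(t-p) + w(t),
    with each Ak diagonal, so the m components of z(t) evolve
    independently.  z: (T, m) array of low-dimensional pose vectors.
    Returns an m x (p+1) matrix whose i-th row is [v_i, a_i1, ..., a_ip]
    (only diagonal entries are stored, since each Ak is diagonal)."""
    T, m = z.shape
    params = np.empty((m, p + 1))
    for i in range(m):
        # Regressors for component i: a constant term plus its p lagged values.
        X = np.column_stack(
            [np.ones(T - p)] + [z[p - k:T - k, i] for k in range(1, p + 1)]
        )
        y = z[p:, i]
        params[i] = np.linalg.lstsq(X, y, rcond=None)[0]
    return params

def model_distance(A, B):
    """Distance between two fitted AR models: D(A, A') = ||A - A'||_F."""
    return np.linalg.norm(A - B)
```

On a noise-free AR(1) sequence z(t) = 0.9 z(t-1), the fit recovers v = 0 and a1 = 0.9 exactly, and the distance between a model and itself is zero.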

Description

Technical field

[0001] The invention relates to a representation method of human motion, belonging to the fields of computer vision and intelligent analysis of video content.

Background technique

[0002] In the fields of computer vision and intelligent video content analysis, human motion analysis has become a very important and cutting-edge research topic [1-7]. Within human motion analysis, the detection and tracking of human motion belong to low-level vision processing, while the representation and understanding of human motion belong to high-level processing. Representation and understanding of human movement play a vital role in intelligent video surveillance and other application fields.

[0003] Over the past ten years, many ways of representing human motion have emerged [1-6]. Most research work extracts static information from each frame, or motion information between adjacent frames, to represent human motion. Efros et al. [1] used optical flow to represent people...

Claims


Application Information

Patent Timeline
no application data available
Patent Type & Authority Patents(China)
IPC(8): G06T17/00
Inventor 陈峰 (Chen Feng), 杜友田 (Du Youtian)
Owner TSINGHUA UNIV