
Human behavior video identification method

A technology for recognizing human behavior in video, applied in the field of pattern recognition. It addresses problems such as poor clustering performance, the failure to consider the influence of each dimension on the similarity of data points, and the resulting differences in clustering results.

Publication Status: Inactive; Publication Date: 2017-03-22
UNIV OF SHANGHAI FOR SCI & TECH

AI Technical Summary

Problems solved by technology

[0004] However, this algorithm does not take into account the impact of each dimension on the similarity of data points, that is, the differing effect that each dimension has on the clustering results. For example, when the similarity of data points is calculated with a simple Euclidean distance, the result is poor and the clustering cannot reflect the differences between the data dimensions.
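To make the shortcoming concrete, the sketch below (illustrative only, not taken from the patent; the feature values and the weight vector w are made up) contrasts a plain Euclidean similarity with a per-dimension weighted similarity, in which dimensions that matter more for the behavior receive larger weights:

import numpy as np

def euclidean_similarity(x, y):
    # Negative squared Euclidean distance: every dimension counts equally.
    return -np.sum((x - y) ** 2)

def weighted_similarity(x, y, w):
    # Per-dimension weights w let influential dimensions dominate the comparison.
    return -np.sum(w * (x - y) ** 2)

# Two hypothetical 10-dimensional feature vectors and a hypothetical weight vector.
a = np.array([0.9, 0.1, 0.1, 0.1, 0.1, 0.8, 0.1, 0.1, 0.1, 0.1])
b = np.array([0.1, 0.1, 0.1, 0.1, 0.1, 0.8, 0.1, 0.1, 0.1, 0.1])
w = np.array([3.0, 0.5, 0.5, 0.5, 0.5, 3.0, 0.5, 0.5, 0.5, 0.5])

print(euclidean_similarity(a, b))    # -0.64: all dimensions treated alike
print(weighted_similarity(a, b, w))  # -1.92: the heavily weighted first dimension dominates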

Method used



Examples


Embodiment 1

[0095] In Embodiment 1, the joint points of the human body are divided into five parts according to the joint point information of the human skeleton; the human behavior energy features of each image frame are then obtained; finally, the sequence of human behavior action images is clustered with SWAP to obtain the cluster centers, which serve as the key frames.

[0096] This embodiment includes the following steps:

[0097] 1. Divide the 20 skeleton joint points of the human body into five body parts:

[0098] f_{1,t}: Torso (J1, J2, J3, J4); f_{2,t}: Right upper limb (J5, J6, J7, J8)

[0099] f_{3,t}: Left upper limb (J9, J10, J11, J12); f_{4,t}: Right lower limb (J13, J14, J15, J16)

[0100] f_{5,t}: Left lower limb (J17, J18, J19, J20); see Figure 2 for details.
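The grouping in paragraphs [0097] to [0100] can be written down directly; the sketch below is a hypothetical illustration (joint indices J1 to J20 follow the numbering above, and the coordinate format is assumed):

# Hypothetical mapping of the 20 skeleton joints to the five body parts of [0097]-[0100].
BODY_PARTS = {
    "torso":            [1, 2, 3, 4],      # f_{1,t}
    "right_upper_limb": [5, 6, 7, 8],      # f_{2,t}
    "left_upper_limb":  [9, 10, 11, 12],   # f_{3,t}
    "right_lower_limb": [13, 14, 15, 16],  # f_{4,t}
    "left_lower_limb":  [17, 18, 19, 20],  # f_{5,t}
}

def split_skeleton(joints):
    # joints: dict mapping joint index 1..20 to an (x, y, z) coordinate tuple.
    # Returns the coordinates of each body part as a separate list.
    return {part: [joints[j] for j in ids] for part, ids in BODY_PARTS.items()}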

[0101] 2. Calculate the kinetic energy and potential energy of the five parts separately to obtain the 10-dimensional human behavior energy features; each feature represents...
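One plausible reading of step 2 (a sketch only; the patent's exact energy definitions are not reproduced in this excerpt) is to estimate the kinetic term of each part from frame-to-frame joint displacement and the potential term from the joints' vertical coordinates, giving two values per part and hence a 10-dimensional feature per frame:

import numpy as np

def energy_features(frames, body_parts, dt=1.0):
    # frames: array of shape (T, 20, 3) with the joint coordinates of every frame.
    # body_parts: mapping from part name to a list of joint indices (1..20),
    #             e.g. the BODY_PARTS dict sketched above.
    # Returns a (T, 10) matrix: one kinetic and one potential term per part.
    T = frames.shape[0]
    feats = np.zeros((T, 2 * len(body_parts)))
    for t in range(T):
        for p, ids in enumerate(body_parts.values()):
            idx = [j - 1 for j in ids]
            pts = frames[t, idx]
            if t > 0:
                v = (pts - frames[t - 1, idx]) / dt        # joint velocities
                feats[t, 2 * p] = 0.5 * np.sum(v ** 2)     # kinetic-energy term
            feats[t, 2 * p + 1] = np.sum(pts[:, 1])        # potential-energy term (assumes y is vertical)
    return feats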

Embodiment 2

[0114] The key frame extraction results of Embodiment 2 are shown in Table 1 and Figure 3. Table 1 describes the number of key frames extracted for the drinking action: this embodiment extracts 37 key frames from the 1583 image frames of the drinking action. Figure 3 shows the key frame extraction result; because drinking is a cyclic action containing multiple repetitions of the drinking motion, Figure 3 presents the key frames extracted from one cycle of the action, which contains 9 key frames.

[0115] Table 1 Description of Human Behavior Library CAD-60

[0116]

[0117] In this embodiment, six actions in CAD-60 (cutting vegetables, drinking water, rinsing mouth, standing, making a phone call, and writing on a blackboard) are used to illustrate that the present invention can improve the speed of human behavior recognition. Through the steps of the embodiment, key frames can be extracted for six actions, and ...



Abstract

The invention belongs to the field of pattern recognition and relates to a human behavior video identification method, which aims to make the extracted key frames more representative and to greatly improve video identification speed and precision. The method comprises extracting key frames from a human body video, and improving affinity propagation clustering by adding an adaptive weighting factor to obtain a self-adaptive weighted affinity propagation (SWAP) clustering algorithm. The key frame extraction comprises the following steps: (1) representing human behavior through energy features; and (2) obtaining the key frames by SWAP clustering. By applying the SWAP clustering algorithm to key frame extraction from human behavior video, the method accounts for the differences between human physiological structures, reflects well the different degrees to which those structures participate in human behaviors, and extracts key frames quickly and reliably, which is of great benefit for fast human behavior recognition, fast video browsing and the like.
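As a rough illustration of the clustering stage, the sketch below runs standard affinity propagation on a precomputed, weighted similarity matrix and treats the resulting exemplars as key frames. It is only a stand-in under stated assumptions: the fixed weight vector replaces the patent's adaptive weighting, whose exact update rule is not given in this excerpt, and the random input merely shows the call pattern.

import numpy as np
from sklearn.cluster import AffinityPropagation

def extract_key_frames(features, weights=None):
    # features: (T, D) per-frame energy features; weights: optional (D,) per-dimension weights.
    # Returns the indices of the exemplar frames, taken here as the key frames.
    if weights is None:
        weights = np.ones(features.shape[1])
    diff = features[:, None, :] - features[None, :, :]
    S = -np.sum(weights * diff ** 2, axis=-1)              # weighted similarity matrix
    ap = AffinityPropagation(affinity="precomputed", random_state=0)
    ap.fit(S)
    return ap.cluster_centers_indices_                     # cluster centers = key frames

# Toy usage with random features; real input would be the 10-D energy features per frame.
rng = np.random.default_rng(0)
print(extract_key_frames(rng.random((50, 10))))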

Description

Technical field

[0001] The invention belongs to the field of pattern recognition and relates to a human behavior video recognition method, which can be applied to pattern recognition fields such as image retrieval, gesture recognition, and behavior recognition.

Background technique

[0002] Research on human behavior recognition has important theoretical value and involves disciplines such as computer vision, sensor technology, pattern recognition and artificial intelligence. Traditional behavior recognition uses the information of all video frames to recognize human behavior; because human behavior contains a great deal of information, a large amount of data must be processed, the recognition task is heavy, the recognition time is long, and the hardware requirements are stringent. Schindler and Gool found that the amount of information contained in a video frame segment containing a small nu...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F17/30; G06K9/00
CPC: G06F16/784; G06V40/103
Inventor: 孙书鑫, 王永雄, 李凯
Owner: UNIV OF SHANGHAI FOR SCI & TECH