
Human body activity recognition method based on grouping residual joint spatial learning

A human activity recognition technology based on grouped residual learning, applied to neural learning methods, character and pattern recognition, and instruments. It addresses the low accuracy of existing methods and achieves the effects of improving recognition accuracy, increasing inter-class distance, and reducing intra-class distance.

Pending Publication Date: 2020-08-28
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0005] To overcome the low accuracy of existing human activity recognition methods and to address activity recognition from sensor data, the present invention proposes a human activity recognition method based on grouped residual joint space learning, which effectively solves the human activity recognition problem and improves recognition accuracy.




Embodiment Construction

[0023] The present invention will be further described below in conjunction with the accompanying drawings.

[0024] Referring to Figure 1 and Figure 2, a human activity recognition method based on joint spatial learning of grouped residuals includes the following steps:

[0025] Step 1. Group, align, and slice single-channel data based on the sliding window. The process is as follows:

[0026] Step 1.1: Collect human activities through wearable sensors, object sensors, and environmental sensors to generate a subset of gesture activity category data, and combine various human activity signals to construct a human activity recognition data set;

[0027] Step 1.2: Use the sliding window method to group, align and slice the data, convert the serialized data into single-channel two-dimensional data, and use the two-dimensional data to extract high-level abstract semantic features;

[0028] Step 1.3: Classify the two-dimensional data subset according to the gesture activity categories.
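The grouping and slicing in Step 1 can be sketched as a simple sliding-window segmentation. This is an illustrative NumPy sketch, not the patent's implementation; the window length and step size here are assumed hyperparameters:

```python
import numpy as np

def sliding_window_slice(signal, window_len=128, step=64):
    """Slice a 1-D sensor stream into overlapping windows.

    Each window becomes one row, so the serialized single-channel
    data turns into a two-dimensional array of shape
    (n_windows, window_len), as described in Step 1.2.
    """
    signal = np.asarray(signal)
    n_windows = (len(signal) - window_len) // step + 1
    if n_windows <= 0:
        return np.empty((0, window_len))
    return np.stack([signal[i * step:i * step + window_len]
                     for i in range(n_windows)])

# Example: 512 samples -> (512 - 128) // 64 + 1 = 7 windows of length 128
x = np.arange(512)
windows = sliding_window_slice(x)
print(windows.shape)  # (7, 128)
```

Overlapping windows (step smaller than window length) are a common choice in activity recognition, since activities rarely align with window boundaries.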



Abstract

A human body activity recognition method based on grouping residual joint spatial learning comprises the following steps: step 1, collecting human, object, and environment signals with various sensors, grouping, aligning, and slicing the single-channel data with a sliding window, and constructing a two-dimensional activity data subset; step 2, building a grouped residual convolutional neural network and constructing a joint space loss function from a center loss function and a cross-entropy loss function to optimize the network model and extract a feature map of the two-dimensional activity data subset; and step 3, training a multi-class support vector machine on the extracted two-dimensional features to perform human body activity classification from the feature map. The invention can identify fine-grained human body activities; the joint spatial loss function increases the inter-class distance of the extracted spatial features and reduces the intra-class distance; and classification learning on the spatial feature maps with a multi-class support vector machine improves the accuracy of human body activity classification.

Description

Technical Field

[0001] The present invention relates to human body activity recognition, and in particular to an activity recognition method based on human-computer interaction and deep learning. The invention realizes human body activity recognition with a grouped residual convolutional neural network and a support vector machine, and belongs to the combined field of human-computer interaction, activity recognition, and deep learning.

Background Technique

[0002] With the advent of deep learning, the concept of human-computer interaction has developed further in current applications. The future world will be one where everything is interconnected and people and the environment are gradually integrated, so information on human activities must be collected in real time and processed effectively. Research on human activity recognition has therefore received extensive attention from researchers. The research on human activi...
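The grouped residual idea mentioned above can be illustrated with a toy forward pass: the channels are split into groups, each group gets its own transform, and a skip connection adds the input back. This NumPy sketch stands in a per-group linear map for the real grouped convolution, purely for illustration:

```python
import numpy as np

def grouped_residual_block(x, weights):
    """Toy grouped residual block.

    x: (channels, length) feature map. Channels are split into
    len(weights) equal groups; group g is transformed by its own
    weight matrix weights[g] (a per-group linear map standing in
    for a grouped convolution), then the input is added back
    (the residual / skip connection).
    """
    groups = len(weights)
    c = x.shape[0]
    assert c % groups == 0, "channels must divide evenly into groups"
    gs = c // groups
    out = np.concatenate(
        [weights[g] @ x[g * gs:(g + 1) * gs] for g in range(groups)],
        axis=0)
    return out + x  # residual connection

# 4 channels, 2 groups, identity weights -> output doubles the input
x = np.ones((4, 8))
ws = [np.eye(2), np.eye(2)]
y = grouped_residual_block(x, ws)
print(y[0, 0])  # 2.0
```

Grouping reduces parameters (each transform only sees its own channel subset), while the residual path keeps gradients flowing through deep stacks of such blocks.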

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/084; G06N3/048; G06N3/045; G06F2218/12; G06F18/241; G06F18/2411
Inventor: 吕明琪, 陈文青
Owner: ZHEJIANG UNIV OF TECH