Action identification method based on wearable device

An action recognition technology for wearable devices, applied in the field of pattern recognition. It addresses problems such as insufficient classification speed, high computational complexity, inability to eliminate data interference, and high equipment complexity, with the effects of streamlining the multi-classification process, improving classification accuracy, and reducing the amount of computation.

Inactive Publication Date: 2018-06-15
NANJING UNIV OF POSTS & TELECOMM

AI Technical Summary

Problems solved by technology

[0003] A search of the prior art literature found that S. Karungaru et al. published a paper titled "Human action recognition using wearable sensors and neural networks" at the 2015 10th Asian Control Conference (ASCC), which collects motion data with multiple acquisition nodes and recognizes human body movements with a neural network algorithm. However, a device with more than five nodes is too complex and is greatly limited in daily applications.
[0004] A further search found that F. T. Liu et al. published a paper titled "Gesture recognition with wearable 9-axis sensors" at the 2017 IEEE International Conference on Communications (ICC), which classifies and recognizes actions with a support vector machine (SVM). However, this method only optimizes feature extraction; it uses a simple window segmentation for action data extraction and therefore cannot rule out interference from data in non-stationary states, and it adopts the traditional 1V1 (one-versus-one) strategy for SVM multi-classification, as shown in the sketch below, which is deficient in classification speed and computational complexity.
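To illustrate why the 1V1 strategy scales poorly, the sketch below (our own illustration, not taken from either cited paper) counts the binary sub-SVMs required by 1V1 versus a cascaded, multi-level arrangement for k action classes.

```python
# Hypothetical illustration: sub-SVM counts for k action classes.

def one_vs_one_subsvms(k: int) -> int:
    """The 1V1 strategy trains one binary SVM per unordered class pair: k*(k-1)/2."""
    return k * (k - 1) // 2

def multilevel_subsvms(k: int) -> int:
    """A binary cascade splits off one class (or group) per level: k-1 SVMs."""
    return k - 1

for k in (5, 10):
    print(k, one_vs_one_subsvms(k), multilevel_subsvms(k))
# e.g. 5 classes: 10 sub-SVMs for 1V1 vs 4 for a cascade
```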

Method used




Embodiment Construction

[0062] In order to make the purpose and technical solutions of the embodiments of the present invention clearer, the technical solutions of the embodiments are described clearly and completely below in conjunction with the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. Based on the described embodiments, all other embodiments obtained by persons of ordinary skill in the art without creative work fall within the protection scope of the present invention.

[0063] Figure 1 is a schematic diagram of the system model, showing the detailed workflow of the design: motion data collection, data transmission, feature extraction, and classification recognition.
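As a rough illustration of that flow (our own sketch, not the patent's code; the callables and the threshold value are placeholders), the function below segments an already-received buffer of inertial samples, extracts features from each segment, and classifies them:

```python
import numpy as np

def recognize_buffer(buffer, classifier, detect_action_segments, window_features):
    """buffer: (n_samples, n_axes) inertial data already received from the node.
    detect_action_segments and window_features stand for the detection and
    feature stages sketched in the following paragraphs."""
    magnitude = np.linalg.norm(buffer, axis=1)            # tri-axial -> 1-D magnitude
    results = []
    for start, end in detect_action_segments(magnitude, threshold=0.5):
        feats = window_features(buffer[start:end])        # one feature vector per action
        results.append(classifier.predict(feats.reshape(1, -1))[0])
    return results
```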

[0064] Figure 2 is a schematic diagram of the differential-threshold action detection.
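A minimal sketch of one way such differential-threshold detection can be implemented (our reading of the idea; the threshold and the `min_gap` hysteresis parameter are assumptions, not values from the patent):

```python
import numpy as np

def detect_action_segments(signal, threshold, min_gap=10):
    """Differential-threshold segmentation of a 1-D magnitude sequence:
    a segment starts when the absolute first difference exceeds `threshold`
    and ends once it stays below the threshold for `min_gap` samples."""
    diff = np.abs(np.diff(signal))
    segments, start, quiet = [], None, 0
    for i, d in enumerate(diff):
        if d > threshold:
            if start is None:
                start = i          # rising activity: mark segment start
            quiet = 0
        elif start is not None:
            quiet += 1
            if quiet >= min_gap:   # sustained stillness: close the segment
                segments.append((start, i - quiet + 1))
                start, quiet = None, 0
    if start is not None:
        segments.append((start, len(diff)))
    return segments
```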

[0065] Figure 3 is the comparison between the ker...
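The patent's actual simplified kernel is not reproduced in this excerpt; as a hypothetical stand-in, the snippet below only shows how a low-cost custom kernel can be supplied to an SVM through scikit-learn's callable-kernel interface (here a plain linear kernel):

```python
import numpy as np
from sklearn.svm import SVC

def simple_kernel(X, Y):
    """Low-cost illustrative kernel: the Gram matrix of inner products."""
    return np.dot(X, Y.T)

clf = SVC(kernel=simple_kernel)   # used like any other SVC, e.g. clf.fit(X, y)
```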



Abstract

The invention discloses an action identification method based on a wearable device. Data are collected by a single inertial sensor node, which is more convenient and comfortable to wear than multi-node schemes. Complete motion data are automatically sliced by a differential-threshold detection method, effectively reducing the interference of non-action data. Based on an analysis of five kinds of action features, the standard mean value, standard deviation, kurtosis, skewness and minimum value are extracted as the feature values for identification and classification, and dimension-reduction processing is applied to the features to reduce the redundancy of the feature information. For the linearly inseparable problem, a simpler kernel function is designed to reduce the computational complexity. The multi-classification process is implemented by a multi-level SVM; compared with the traditional 1V1 strategy, this design reduces the number of sub-SVMs and adjusts the classification priority according to the distance between class gravity centers to improve classification accuracy, providing an effective scheme for wearable action identification based on inertial data features.
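The sketch below ties the abstract's ingredients together: the five per-axis statistics, and a cascaded ("multi-level") SVM whose classification order is driven by the distance between class centroids ("gravity centers"). It is an illustration under our own assumptions (linear kernels, farthest-centroid-first ordering), not the patent's implementation; dimension reduction (e.g., PCA) would typically be applied to the feature matrix before training.

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.svm import SVC

def window_features(w):
    """w: (n_samples, n_axes) inertial window -> mean, std, kurtosis,
    skewness and minimum per axis (the five features named in the abstract)."""
    return np.concatenate([w.mean(0), w.std(0), kurtosis(w, 0), skew(w, 0), w.min(0)])

def build_cascade(X, y):
    """Train one binary SVM per level; at each level split off the class whose
    centroid lies farthest from the remaining classes (assumed ordering rule)."""
    labels = list(np.unique(y))
    cascade = []
    while len(labels) > 1:
        cents = {c: X[y == c].mean(axis=0) for c in labels}
        def rest(c):
            return np.mean([cents[o] for o in labels if o != c], axis=0)
        target = max(labels, key=lambda c: np.linalg.norm(cents[c] - rest(c)))
        mask = np.isin(y, labels)                       # only classes still unresolved
        clf = SVC(kernel="linear").fit(X[mask], (y[mask] == target).astype(int))
        cascade.append((target, clf))
        labels.remove(target)
    return cascade, labels[0]                           # last remaining label is the default

def cascade_predict(cascade, default, x):
    for target, clf in cascade:
        if clf.predict(x.reshape(1, -1))[0] == 1:
            return target
    return default
```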

Description

Technical Field

[0001] The invention relates to an action recognition method based on a wearable device and belongs to the technical field of pattern recognition.

Background

[0002] With the development of microelectronics technology and artificial intelligence, intelligent electronic devices are becoming more and more popular in daily life, and the recognition of human body movements by devices has become a new form of human-computer interaction. Among existing action recognition systems, methods based on video analysis use the motion feature information recorded in video to analyze action behavior; the Microsoft Kinect somatosensory game control platform is a well-known example of such applications. This type of method has a complex structure, a large amount of data computation, and limited application scenarios, so it is not suitable for portable wearable devices. In contrast, motion recognition based on iner...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06F3/01; G06K9/00; G06K9/62
CPC: G06F3/011; G06F3/017; G06V40/28; G06V40/20; G06F18/2411
Inventor: 司玉仕, 黄学军, 黄秋实
Owner: NANJING UNIV OF POSTS & TELECOMM