Multi-mode complex activity recognition method based on deep learning model

A deep-learning-based activity recognition technology, applied in the field of activity recognition, with the effect of improving recognition accuracy and increasing non-linear modeling capability

Active Publication Date: 2018-12-07
ZHEJIANG UNIV

AI Technical Summary

Problems solved by technology

[0007] The technical problem to be solved by the present invention is how to effectively use multi-modal time-series data for complex activity recognition; to this end, a multi-modal complex activity recognition method based on a deep learning model is proposed.




Detailed Description of the Embodiments

[0038] In order to make the objects, technical solutions, and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here only explain the present invention and do not limit its protection scope.

[0039] To solve the problems described in the background, the present invention defines a complex activity as a sequence of simple activities carrying high-level semantics, and completes the complex activity recognition task by extracting sequence features. First, the time-series data of the different modalities are divided into three categories according to their attributes; then, convolutional sub-networks with different structures are constructed for feature extraction; next, a longitudinal splicing layer and convolutional layers are used to fuse the features of the different modalities; ...
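The three-stage pipeline above (per-modality convolutional sub-networks, longitudinal splicing plus a fusion layer, then sequence modeling) can be illustrated with a hypothetical, shape-level sketch in plain NumPy. The modality choices, channel counts, kernel sizes, and random fixed weights below are illustrative assumptions, not parameters from the patent; the recurrent pass is a toy stand-in for the LSTM described in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three hypothetical modality groups of time-series data, each T = 128 steps:
# e.g. a 3-axis accelerometer, a 3-axis gyroscope, a 1-channel physiological signal.
acc  = rng.standard_normal((3, 128))
gyro = rng.standard_normal((3, 128))
phys = rng.standard_normal((1, 128))

def conv1d_subnet(x, out_channels=8, kernel=5, stride=2, seed=1):
    """Stand-in for a modality-specific convolutional sub-network:
    fixed random filters followed by ReLU (illustrative only)."""
    c_in, t = x.shape
    w = np.random.default_rng(seed).standard_normal((out_channels, c_in, kernel)) * 0.1
    t_out = (t - kernel) // stride + 1
    y = np.zeros((out_channels, t_out))
    for o in range(out_channels):
        for i in range(t_out):
            seg = x[:, i * stride : i * stride + kernel]
            y[o, i] = np.maximum((w[o] * seg).sum(), 0.0)  # ReLU
    return y

# Stage 1: separate feature extraction per modality category.
feats = [conv1d_subnet(m) for m in (acc, gyro, phys)]   # each (8, 62)

# Stage 2: longitudinal (channel-wise) splicing, then a fusing layer
# (a 1x1 channel-mixing matrix here, standing in for the fusion conv layer).
stacked = np.concatenate(feats, axis=0)                 # (24, 62)
mix = np.random.default_rng(2).standard_normal((16, stacked.shape[0])) * 0.1
fused = np.maximum(mix @ stacked, 0.0)                  # (16, 62)

# Stage 3: a toy recurrent pass standing in for the LSTM, then class scores
# over a hypothetical set of 4 complex-activity labels.
h = np.zeros(16)
for t in range(fused.shape[1]):
    h = np.tanh(0.5 * h + 0.5 * fused[:, t])
logits = np.random.default_rng(3).standard_normal((4, 16)) @ h
label = int(np.argmax(logits))
print(stacked.shape, fused.shape, label)
```

The sketch only demonstrates the data flow and tensor shapes; a real implementation would learn the sub-network, fusion, and LSTM weights jointly by backpropagation.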



Abstract

The invention discloses a multi-modal complex activity recognition method based on a deep learning model. Specifically, the method comprises: step one, dividing time-series data of different modalities into different types and extracting representations using convolutional neural networks (CNNs) with different structures; step two, fusing the representations of the different modalities using a longitudinal splicing layer and convolutional layers; and step three, further extracting sequence features using an LSTM network to obtain the complex activity label. The invention recognizes complex activities with a deep learning model and has broad application prospects in fields such as health care, industrial assistance, and skill evaluation.

Description

Technical Field

[0001] The invention belongs to the field of activity recognition, and in particular relates to a multi-modal complex activity recognition method based on a deep learning model.

Background Technique

[0002] Activity recognition is a basic and important research direction in the field of ubiquitous computing. With the development and popularization of wearable devices, activity recognition has been widely applied to elderly assistance, newborn monitoring, and skill assessment.

[0003] According to whether the activity label carries high-level semantics, activity recognition can be divided into simple activity recognition and complex activity recognition. Simple activities usually consist of periodic movements or single postures of the human body, such as standing, sitting, walking, and running. Complex activities are usually composed of simpler activities, last longer, and carry high-level semantics, such as eating, working, and shopping. The current metho...
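The simple/complex distinction in paragraph [0003] can be made concrete with a small hypothetical example: a complex activity is a longer episode whose per-segment labels are simple activities. The episode, its labels, and the summarization below are illustrative assumptions, not data from the patent.

```python
from collections import Counter

# Simple activities: periodic movements or single postures, per the background.
SIMPLE = {"standing", "sitting", "walking", "running"}

# A toy "eating" episode, segmented into per-minute simple-activity labels.
eating_episode = ["walking", "sitting", "sitting", "sitting", "standing", "sitting"]

# Every segment of the complex activity is itself a simple activity...
assert set(eating_episode) <= SIMPLE

# ...but the complex label ("eating") is not recoverable from any single
# segment, which is why the patent models the label *sequence*. A naive
# majority vote over segments only yields the dominant posture:
dominant = Counter(eating_episode).most_common(1)[0][0]
print(dominant)  # → sitting
```

This is exactly the gap the invention targets: recognizing the high-level label requires modeling the order and composition of the simple activities, not just their frequencies.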

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62; G06N3/04
CPC: G06N3/049; G06N3/045; G06F18/253
Inventors: 陈岭, 刘啸泽
Owner: ZHEJIANG UNIV