Super-joint and multi-modal network and behavior identification method thereof

A recognition method and super-joint technology, applied in the field of neural networks, solving the problems that prior methods ignore dependencies between joints and that joint positions alone cannot express a joint's dependency on its adjacent points, thereby improving the recognition effect and the performance of action recognition.

Pending Publication Date: 2022-07-22
CHANGZHOU UNIV

AI Technical Summary

Problems solved by technology

Although skeleton joints are specially designed to capture the spatial structure of the human body, the information carried by a joint point represents only its position, not its dependency on adjacent points.
Previous methods usually use joint positions as the spatial-structure representation of the skeleton, while ignoring the dependencies between joints.
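
As a minimal illustration of this gap (not the patented method), the sketch below contrasts raw joint positions with parent-relative bone vectors, one elementary way to make adjacency dependencies explicit. The joint count and parent list are hypothetical.

# A minimal sketch (not the patented method): raw joint positions
# encode *where* each joint is, but not how it relates to its
# neighbours. Relative vectors along skeleton bones are one simple
# way to expose those dependencies. The parent list is hypothetical.
import numpy as np

# Hypothetical 5-joint chain: entry i is the parent of joint i
# (the root joint is its own parent, so its vector is zero).
PARENTS = [0, 0, 1, 2, 3]

def joint_dependencies(positions: np.ndarray) -> np.ndarray:
    """positions: (num_joints, 3) array of 3D joint coordinates.
    Returns (num_joints, 3) vectors from each joint's parent to the
    joint, i.e. a dependency-aware representation of the same pose."""
    return positions - positions[PARENTS]

pose = np.random.rand(5, 3)          # one skeleton frame
print(joint_dependencies(pose))      # bone vectors instead of raw positions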

Method used



Examples


Embodiment Construction

[0060] The present invention will be further described below with reference to the accompanying drawings and embodiments. The figures are simplified schematic diagrams that illustrate the basic structure of the invention in a schematic manner, so they show only the structures relevant to the invention.

[0061] To evaluate the effectiveness of the method of the present invention, experiments are carried out on public datasets containing depth maps and skeleton information. A large public dataset provides a wider range of training data and thus a stronger model; to verify the robustness of the method, a classic small dataset is also selected. Experiments are therefore conducted on two datasets of distinct scales: UTD-MHAD and NTU-RGB+D.

[0062] The invention is implemented on the PyTorch framework, with Python version 3...
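
As a minimal PyTorch sketch (an assumption, not the patent's implementation), the static and dynamic data streams named in the abstract can be derived from a skeleton sequence as raw per-frame coordinates and frame-to-frame differences respectively; the tensor shapes and function name below are hypothetical.

# A minimal PyTorch sketch, assuming skeleton input of shape
# (batch, frames, joints, 3). "Static" = raw coordinates per frame;
# "dynamic" = frame-to-frame differences (motion). Names are
# hypothetical, not taken from the patent text.
import torch
import torch.nn.functional as F

def static_dynamic_streams(skeleton: torch.Tensor):
    static = skeleton                                  # (B, T, J, 3)
    dynamic = skeleton[:, 1:] - skeleton[:, :-1]       # (B, T-1, J, 3)
    # pad the temporal axis so both streams share the same length T
    dynamic = F.pad(dynamic, (0, 0, 0, 0, 1, 0))
    return static, dynamic

x = torch.randn(2, 30, 25, 3)   # e.g. NTU-RGB+D skeletons use 25 joints
s, d = static_dynamic_streams(x)
print(s.shape, d.shape)         # torch.Size([2, 30, 25, 3]) twice

The same construction would apply to super-joint data by substituting the super-joint coordinates for the raw joint coordinates.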



Abstract

The invention relates to the technical field of neural networks, in particular to a super-joint and multi-modal network and a behavior recognition method thereof. The method comprises the steps of: collecting a human-body depth map, extracting depth-map features through a DMMS stream, and computing a depth-data prediction score; collecting a human skeleton sequence, extracting original-joint and super-joint data respectively, constructing skeleton information by combining super-joints with ordinary joints, feeding the skeleton information into a structured spatial-temporal feature learning model to obtain static and dynamic joint data streams and static and dynamic super-joint data streams, and performing adaptive weight fusion on the original-joint and super-joint data streams to obtain a joint-data prediction score and a super-joint prediction score respectively; and adding the classification prediction scores of the DMMS stream and the skeleton stream to generate the final prediction score. The method learns the rich spatial texture information of human body parts from the depth map, and learns the rich spatial-temporal features of motion posture changes from the skeleton sequence.
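
A minimal PyTorch sketch of the score-level fusion described above, assuming four skeleton streams (static/dynamic joint, static/dynamic super-joint) whose classification scores are combined with learnable adaptive weights and then added to the DMMS-stream score; the class name, stream count, and dimensions are hypothetical, not from the patent text.

# A minimal sketch of adaptive score-level fusion. Per-stream
# classification scores are weighted by learnable parameters
# (normalised with softmax) and summed, then added to the DMMS
# stream's score. All names and sizes are hypothetical.
import torch
import torch.nn as nn

class ScoreFusion(nn.Module):
    def __init__(self, num_streams: int = 4):
        super().__init__()
        # one learnable weight per skeleton stream (static/dynamic
        # joint, static/dynamic super-joint)
        self.weights = nn.Parameter(torch.ones(num_streams))

    def forward(self, skeleton_scores, dmms_score):
        # skeleton_scores: list of (batch, num_classes) tensors
        w = torch.softmax(self.weights, dim=0)
        skeleton = sum(wi * s for wi, s in zip(w, skeleton_scores))
        return skeleton + dmms_score    # final prediction score

fusion = ScoreFusion()
streams = [torch.randn(8, 27) for _ in range(4)]  # e.g. 27 UTD-MHAD classes
final = fusion(streams, torch.randn(8, 27))
print(final.shape)  # torch.Size([8, 27])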

Description

technical field

[0001] The invention relates to the technical field of neural networks, in particular to a super-joint and multi-modal network and a behavior recognition method thereof.

Background technique

[0002] Human action recognition has become a hot topic in machine learning and computer vision because of its broad application prospects, and it has very important theoretical research value. Unlike the recognition of static images, behavior recognition must comprehensively consider continuously changing images and construct a spatiotemporal feature representation of human behavior from a motion sequence composed of static human-pose images, in order to classify and recognize actions. At present, human behavior recognition is widely used in human-computer interaction, sports analysis, intelligent monitoring, virtual reality, and other fields.

[0003] At present, behavior recognition algorithms are mainl...

Claims


Application Information

IPC(8): G06V40/10; G06V40/20; G06V10/34; G06V10/44; G06V10/80; G06V10/764; G06V10/82; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06N3/048; G06N3/045; G06F18/241; G06F18/253
Inventors: 侯振杰, 施海勇, 钟卓锟, 尤凯军
Owner: CHANGZHOU UNIV