
Human body action classification method and device, terminal equipment and storage medium

A human action classification technology, applied in the field of image analysis, that addresses the high computational complexity, slow computation speed, and resulting low efficiency of existing human action classification methods.

Pending Publication Date: 2021-04-06
SHENZHEN UNIV

AI Technical Summary

Problems solved by technology

[0005] The main purpose of the present invention is to provide a human action classification method, device, terminal equipment and computer-readable storage medium, aiming to solve the technical problem that existing human action classification methods have high computational complexity and slow computation speed, which makes obtaining human action classification results inefficient.




Detailed Description of Embodiments

[0060] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present invention, not all of them. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0061] With the development of the GPU (Graphics Processing Unit), the deep convolutional neural network (DCNN), with its powerful feature extraction ability, has come to play an important role in radar-based (especially micro-Doppler radar-based) human action classification. Before the radar signal is sent to the DCNN, it is usually preprocessed to make it more expressive, and the most common preprocessing method is to convert the radar signal into...
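The preprocessing step described here, which per the abstract turns the radar echo into a spectrogram before classification, is commonly implemented with a short-time Fourier transform. A minimal sketch under assumed parameters; the signal model, pulse repetition frequency and window sizes below are illustrative and not taken from the patent:

```python
# Hypothetical sketch: converting a (synthetic) radar return into a
# micro-Doppler spectrogram via the short-time Fourier transform (STFT).
import numpy as np
from scipy.signal import stft

fs = 1000.0                      # pulse repetition frequency, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)  # 2 s of slow-time samples

# Toy echo: a torso return at 50 Hz Doppler plus a sinusoidally
# modulated limb return (micro-Doppler), in complex baseband, with noise.
torso = np.exp(2j * np.pi * 50 * t)
limb = 0.5 * np.exp(2j * np.pi * (50 * t + 20 * np.sin(2 * np.pi * 2 * t)))
noise = 0.1 * (np.random.randn(t.size) + 1j * np.random.randn(t.size))
echo = torso + limb + noise

# STFT gives a time-frequency map; its magnitude is the spectrogram
# that would be fed to the classifier.
f, tau, S = stft(echo, fs=fs, nperseg=128, noverlap=96,
                 return_onesided=False)
spectrogram = np.abs(S)          # shape: (frequency bins, time frames)
```

The two-sided STFT is used because a complex baseband radar signal distinguishes positive (approaching) from negative (receding) Doppler shifts.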


Abstract

The invention discloses a human body action classification method comprising the following steps: transmitting electromagnetic waves to a target person by means of radar to obtain a target echo of the target person; obtaining a target spectrogram based on the target echo; and inputting the target spectrogram into a trained one-dimensional human body action classifier for classification, so as to obtain a human body action classification result for the target person. The invention further discloses a human body action classification device, terminal equipment and a computer-readable storage medium. The method improves the efficiency with which the terminal equipment obtains the human body action classification result and provides a good user experience.
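The pipeline above ends in a "one-dimensional" classifier, which suggests convolving along the time axis only, treating the spectrogram's frequency bins as input channels; this is cheaper than a 2-D CNN over the full image. A minimal NumPy forward-pass sketch of that idea follows. All shapes, weights and the class count are illustrative assumptions, since the patent text shown here does not disclose the actual architecture:

```python
# Hypothetical sketch of a one-dimensional classifier over a spectrogram:
# the F frequency bins act as input channels, and a single 1-D convolution
# runs along the time axis only.
import numpy as np

rng = np.random.default_rng(0)
F, T, C = 128, 64, 4          # freq bins, time frames, action classes (assumed)
spec = rng.random((F, T))     # stand-in for the target spectrogram

# One 1-D conv layer: 16 filters of width 5 sliding over the time axis.
W = rng.standard_normal((16, F, 5)) * 0.01
b = np.zeros(16)
feat = np.empty((16, T - 4))
for k in range(16):
    for t0 in range(T - 4):
        feat[k, t0] = np.sum(W[k] * spec[:, t0:t0 + 5]) + b[k]
feat = np.maximum(feat, 0.0)              # ReLU
pooled = feat.mean(axis=1)                # global average pooling -> (16,)

# Linear head plus softmax over the C action classes.
Wo = rng.standard_normal((C, 16)) * 0.01
logits = Wo @ pooled
probs = np.exp(logits - logits.max())
probs /= probs.sum()                      # class probabilities, sum to 1
```

In practice the convolution would be vectorized and the weights learned by backpropagation; the loop form is kept here only to make the 1-D sliding-window structure explicit.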

Description

Technical Field

[0001] The present invention relates to the technical field of image analysis, and in particular to a human action classification method, device, terminal equipment and computer-readable storage medium.

Background Art

[0002] Human action classification techniques play an important role in various applications, such as security defense and smart environments. Early human action classification technology mainly used optical cameras to collect optical images of human actions and input the collected images into a human action classification model to obtain the classification results. Although optical cameras offer high imaging quality, they are highly dependent on lighting conditions: in bad weather such as rain, snow and haze, the quality of the optical images they capture is poor.

[0003] Therefore, related technologies have proposed a human action classification method...

Claims


Application Information

IPC(8): G06K9/62; G06N3/04; G06N3/08; G01S13/89
CPC: G06N3/084; G01S13/89; G06N3/048; G06N3/045; G06F18/241
Inventors: 叶文彬 (Ye Wenbin), 赖国基 (Lai Guoji)
Owner SHENZHEN UNIV