
Infrared human body behavior recognition method based on MobileNetV3 network model

A network-model and recognition-method technology in the field of deep learning. It addresses problems such as over-fitting, insufficient training data, and wasted computing resources, and achieves improved accuracy while reducing parameter count and computation.

Pending Publication Date: 2021-02-23
XINAN JIANGSU ELECTRIC APPLIANCE CO LTD

AI Technical Summary

Problems solved by technology

[0005] Continuously increasing the depth of a network can improve its performance, but as the number of layers grows it brings two main problems: first, over-fitting when the training data are insufficient; second, wasted computing resources, because the activation-function outputs of a large number of neurons in the network eventually become zero.
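The second problem can be illustrated with a toy sketch (not from the patent): stacking randomly initialized dense layers with ReLU activations leaves a large fraction of neuron outputs exactly zero, so those units consume compute while contributing nothing. The layer sizes and weight scale below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# One input vector pushed through five hypothetical dense+ReLU layers.
# ReLU zeroes every negative pre-activation, so roughly half of the
# outputs at each layer are exactly zero ("dead" for this input).
x = rng.normal(size=(1, 512))
for _ in range(5):
    w = rng.normal(scale=0.05, size=(512, 512))
    x = np.maximum(0.0, x @ w)  # ReLU activation

dead_fraction = float(np.mean(x == 0.0))
print(f"fraction of zero activations after 5 layers: {dead_fraction:.2f}")
```

This is only a demonstration of the phenomenon the paragraph names, not an analysis of any particular architecture.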




Embodiment Construction

[0027] To make the purpose, technical solutions, and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention. Accordingly, the following detailed description of the embodiments provided in the accompanying drawings is not intended to limit the scope of the claimed invention, but merely represents selected embodiments of the invention.



Abstract

The invention discloses an infrared human body behavior recognition method based on the MobileNetV3 network model, comprising the following steps: S1, shoot infrared video indoors; S2, have multiple participants perform actions including sitting, standing, lying, walking, running, and jumping; S3, record 230 short videos at a frame rate of 30 fps and cut them into 62,806 images; S4, divide the data into six action classes and split them into 88% training, 10% validation, and 2% test sets; S5, generate training and validation txt label files, and generate seven read files for convenient input to the network; S6, feed the data to the neural network for training; S7, run model prediction to recognize and predict actions. By using the lightweight MobileNetV3 neural network model, the method improves accuracy while reducing parameter count and computation, and can later be deployed on various mobile terminals, including mobile phones and embedded devices such as the Jetson Nano.
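Steps S4 and S5 of the abstract (per-class 88/10/2 split plus txt label files) can be sketched as follows. The directory layout, file naming, and `class_id path` label format are assumptions for illustration, not details taken from the patent.

```python
import random
from pathlib import Path

# The six action classes named in the abstract.
ACTIONS = ["sitting", "standing", "lying", "walking", "running", "jumping"]

def split_dataset(image_dir, seed=0):
    """Split frames per class into 88% train / 10% val / 2% test.

    Assumes a layout of image_dir/<action>/*.jpg (hypothetical).
    """
    rng = random.Random(seed)
    splits = {"train": [], "val": [], "test": []}
    for class_id, action in enumerate(ACTIONS):
        frames = sorted(Path(image_dir, action).glob("*.jpg"))
        rng.shuffle(frames)
        n_train = int(0.88 * len(frames))
        n_val = int(0.10 * len(frames))
        splits["train"] += [(class_id, p) for p in frames[:n_train]]
        splits["val"] += [(class_id, p) for p in frames[n_train:n_train + n_val]]
        splits["test"] += [(class_id, p) for p in frames[n_train + n_val:]]
    return splits

def write_labels(splits, out_dir):
    """Write one txt label file per split, one 'class_id path' line per frame."""
    for name, items in splits.items():
        with open(Path(out_dir, f"{name}.txt"), "w") as f:
            for class_id, path in items:
                f.write(f"{class_id} {path}\n")
```

Shuffling before splitting keeps consecutive (highly similar) video frames from all landing in the same split for trivially easy evaluation; a stricter protocol would split by video rather than by frame.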

Description

Technical field
[0001] The invention relates to the field of deep learning technology, and in particular to an infrared human behavior recognition method based on the MobileNetV3 network model.
Background technique
[0002] Human behavior recognition is currently an important research topic in computer vision. It has wide applications in video surveillance, smart homes, human-computer interaction, and other areas; it has not only important scientific significance but also critical practical value.
[0003] Human behavior recognition obtains data from sensors, videos, and other sources to form a reference database for subsequent network learning. Sensor data are mainly collected by accelerometers, gyroscopes, and similar devices, recording the acceleration and angular velocity of the human body during different actions as one-dimensional data; video data come in many types, and the data sets available for download on the Internet inc...
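The abstract's claim that MobileNetV3 reduces parameters and computation rests on a well-known property of the MobileNet family (a standard fact, not a figure from this patent): replacing a standard convolution with a depthwise separable one. The small calculation below compares parameter counts for one hypothetical layer.

```python
def conv_params(c_in, c_out, k):
    """Parameters of a standard k x k convolution (bias terms ignored)."""
    return c_in * c_out * k * k

def depthwise_separable_params(c_in, c_out, k):
    """Depthwise k x k stage plus 1x1 pointwise stage (bias terms ignored)."""
    return c_in * k * k + c_in * c_out

# Example layer sizes chosen for illustration only.
std = conv_params(64, 128, 3)                  # 73728 parameters
sep = depthwise_separable_params(64, 128, 3)   # 576 + 8192 = 8768 parameters
print(f"standard: {std}, separable: {sep}, ratio: {std / sep:.1f}x")
```

For 3x3 kernels the saving approaches a factor of 8-9x per layer, which is what makes such models practical on mobile phones and boards like the Jetson Nano mentioned in the abstract.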

Claims

Login required to view the claims.

Application Information

IPC(8): G06K9/00; G06K9/62; G06N3/04; G06N3/08
CPC: G06N3/08; G06V40/20; G06N3/045; G06F18/24; G06F18/214
Inventor: 卢新彪 (Lu Xinbiao), 徐嘉雯 (Xu Jiawen)
Owner XINAN JIANGSU ELECTRIC APPLIANCE CO LTD