
Action real-time monitoring method based on YOLO

A real-time action monitoring technology, applied in the fields of instruments, character and pattern recognition, computer components, etc. It solves problems such as inaccurate detection, reliance on a single detection method, and poor robustness of segmentation, and achieves the effect of ensuring monitoring efficiency.

Active Publication Date: 2020-06-05
FOSHAN UNIVERSITY

AI Technical Summary

Problems solved by technology

The segmentation produced by traditional methods is not robust: such methods cannot distinguish the skin regions of different people in an image, and their performance in complex scenes is even less satisfactory.
[0004] The present invention is made to solve problems common in this field, such as reliance on a single detection means, inaccurate detection, and the inability to monitor the action behavior of a target.



Examples


Embodiment 1

[0044] Embodiment 1: A YOLO-based real-time action monitoring method comprising the following steps:
  • S1: Establish a scene model configured to collect video at the position of the camera.
  • S2: Perform object detection on the video in the scene model to obtain detection boxes.
  • S3: Construct and train a deep-learning feature extraction network, and input the detection boxes of all persons into the network to obtain the feature vectors of all detection boxes.
  • S4: Predict the action vector of the video tracking target, and use the feature vectors of the detection boxes together with the predicted action vector to match or predict dangerous actions of the tracking target, obtaining the best-matching and predicted detection box.
  • S5: Mark the detection box of the dangerous action in red.
  • S6: Issue an alarm prompt.
In this embodiment, detection at the camera adopts the YOLOv3 algorithm, ensuring real-time monitoring of the scene and accurate identification of dangerous actions.
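The patent publishes no source code. As a concreteness aid only, the sketch below shows how steps S2, S5, and S6 could be realized with OpenCV's DNN module and pretrained YOLOv3 weights; the file names, the camera index, the 0.5/0.4 thresholds, and the choice to flag every person box in red are illustrative assumptions, not details taken from the invention.

```python
# Minimal sketch of steps S2, S5 and S6 with OpenCV DNN and YOLOv3.
# "yolov3.cfg"/"yolov3.weights", camera index 0 and all thresholds are
# illustrative assumptions, not values from the patent.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
layer_names = net.getUnconnectedOutLayersNames()

cap = cv2.VideoCapture(0)  # S1: video stream at the camera position
while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]

    # S2: run YOLOv3 on the frame to obtain detection boxes
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416),
                                 swapRB=True, crop=False)
    net.setInput(blob)
    outputs = net.forward(layer_names)

    boxes, scores = [], []
    for output in outputs:
        for det in output:
            class_scores = det[5:]
            class_id = int(np.argmax(class_scores))
            conf = float(class_scores[class_id])
            if class_id == 0 and conf > 0.5:  # COCO class 0 = person
                cx, cy, bw, bh = det[0:4] * np.array([w, h, w, h])
                boxes.append([int(cx - bw / 2), int(cy - bh / 2),
                              int(bw), int(bh)])
                scores.append(conf)

    keep = cv2.dnn.NMSBoxes(boxes, scores, 0.5, 0.4)
    for i in np.array(keep).flatten():
        x, y, bw, bh = boxes[i]
        # S5: the dangerous-action decision of S3/S4 would gate this call;
        # the sketch marks every person box in red (BGR (0, 0, 255)).
        cv2.rectangle(frame, (x, y), (x + bw, y + bh), (0, 0, 255), 2)
        print("ALARM: dangerous action candidate")  # S6 placeholder

    cv2.imshow("monitor", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```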

Embodiment 2

[0045] Embodiment 2: A YOLO-based real-time action monitoring method comprising the same steps S1 through S6 as Embodiment 1. Specifically, the present invention discloses a method for real-time action monitoring based on YOLO, mainly for the purpose of ...
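Step S4 can be read as a nearest-neighbour match in feature space between the detection-box feature vectors from S3 and the predicted action vector. The patent does not name a similarity metric; the sketch below assumes cosine similarity with a made-up 0.7 threshold.

```python
# Hypothetical matching step for S4: compare each detection-box feature
# vector against the predicted action vector of the tracked target.
# The cosine-similarity metric and the 0.7 threshold are assumptions.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) /
                 (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def match_dangerous(box_features: list[np.ndarray],
                    predicted_action: np.ndarray,
                    threshold: float = 0.7) -> int | None:
    """Return the index of the best-matching detection box, or None."""
    best_idx, best_sim = None, threshold
    for i, feat in enumerate(box_features):
        sim = cosine_similarity(feat, predicted_action)
        if sim > best_sim:
            best_idx, best_sim = i, sim
    return best_idx

# Usage with random stand-in features (a real system would take these
# from the deep-learning extraction network of step S3):
rng = np.random.default_rng(0)
features = [rng.normal(size=128) for _ in range(4)]
action_vec = rng.normal(size=128)
print(match_dangerous(features, action_vec))
```

Cosine similarity is a natural choice if the extraction network L2-normalizes its embeddings, since the score then depends only on the direction of the feature vectors, not their magnitude.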



Abstract

The invention provides a YOLO-based real-time action monitoring method comprising the following steps: S1, building a scene model configured to collect video at the position of a camera; S2, detecting targets in the scene model to obtain detection boxes; S3, constructing and training a deep-learning feature extraction network, and inputting the detection boxes of all persons into the network to obtain the feature vectors of all detection boxes; S4, predicting the action vector of the video tracking target, and performing dangerous-action matching or prediction on the tracking target using the feature vectors of the detection boxes and the predicted action vector to obtain the best-matching and predicted detection box; S5, marking the detection box of the dangerous action in red; S6, issuing an alarm prompt. Because the detection box can be marked in red, the camera's attention is drawn to it, the object in the detection box can be detected in real time, and the person's action is accurately recognized.
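The deep-learning extraction network of step S3 is likewise not disclosed. Below is a minimal PyTorch sketch of one plausible design: a small convolutional backbone that maps a cropped person box to an L2-normalized feature vector. The architecture, input size, and the 128-dimensional embedding are assumptions for illustration only.

```python
# Hypothetical feature extraction network for step S3: maps a cropped
# person detection box to a fixed-length feature vector. Architecture
# and dimensions are assumptions; the patent does not disclose them.
import torch
import torch.nn as nn

class BoxFeatureNet(nn.Module):
    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.backbone(x).flatten(1)
        feat = self.head(z)
        # L2-normalise so the cosine matching in S4 is well scaled
        return nn.functional.normalize(feat, dim=1)

# A cropped detection box resized to 128x64 (HxW), batch of 1:
net = BoxFeatureNet()
crop = torch.randn(1, 3, 128, 64)
print(net(crop).shape)  # torch.Size([1, 128])
```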

Description

Technical Field
[0001] The invention relates to the technical field of action monitoring, and in particular to a YOLO-based real-time action monitoring method.
Background Art
[0002] Community surveillance relies mainly on security guards watching monitor screens. This not only consumes a great deal of manpower, but manual monitoring also has serious defects: it cannot give real-time warnings about dangerous people and dangerous actions in the footage.
[0003] For example, prior art CN110569711A discloses a Kinect-based recognition method. In static image recognition, when the operator is far away the captured image is blurred and image information cannot be accurately extracted; dynamic recognition additionally suffers from slow processing speed. Another typical prior art, CN102521579A, discloses an action recognition method and system based on a two-dimensional planar camera, which ...


Application Information

IPC(8): G06K9/00
CPC: G06V40/20; G06V2201/07
Inventors: 李伟强, 王东, 杨戬, 陈向荣, 张宁, 毛文磊, 陈嘉欢
Owner: FOSHAN UNIVERSITY