
Security scene flame detection method based on deep learning

A deep-learning flame detection technology, applied in the field of flame detection in security scenes. It addresses problems such as the inability of existing methods to capture the dynamic information of a flame, and the false alarms raised when flame-like objects are recognized as flames.

Pending Publication Date: 2020-10-23
成都睿沿科技有限公司

AI Technical Summary

Problems solved by technology

[0006] The purpose of the present invention is to provide a deep-learning-based flame detection method for security scenes, which solves the technical problems of existing flame detection methods, namely that the dynamic information of the flame cannot be obtained, and that objects resembling flames are easily identified as flames and trigger false alarms.

Method used



Examples


Embodiment 1

[0039] As shown in Figures 1 to 4, the deep-learning-based security scene flame detection method of the present invention comprises the following steps:

[0040] S1. A single-stage detection model for recognizing flame shapes is trained through a deep learning neural network;

[0041] S2. Train a class behavior recognition and classification model for recognizing flame dynamic changes through a deep learning neural network;

[0042] S3. Return the video captured by the monitoring camera in real time to the background server;

[0043] S4. The background server decodes the returned video stream data into multiple frames of pictures;

[0044] S5. Input the picture obtained in step S4 into the single-stage detection model, which checks whether there is a region suspected of being a flame; if not, repeat steps S3 and S4; if so, output the suspected flame region in the picture;

[0045] S6. According to the suspected flame area id...
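The detection loop of steps S3 to S6 can be sketched as follows. This is a minimal illustration, not the patent's implementation: the wrappers `single_stage_detector` and `behavior_classifier` are hypothetical stand-ins for the two trained models, and frames are assumed to be row-major 2D arrays.

```python
# Sketch of steps S3-S6: read decoded frames, detect suspected flame
# regions, and confirm them from the dynamic change over recent frames.
# `single_stage_detector` and `behavior_classifier` are hypothetical
# callables standing in for the two trained models.

def process_stream(frames, single_stage_detector, behavior_classifier,
                   clip_len=16):
    """Yield (x, y, w, h) boxes confirmed as flame by the behavior model."""
    buffer = []
    for frame in frames:
        buffer.append(frame)
        # S5: look for regions whose single-frame appearance resembles a flame.
        regions = single_stage_detector(frame)
        if not regions:
            continue  # no suspected flame: keep reading the stream (S3/S4)
        if len(buffer) < clip_len:
            continue  # not enough history yet to judge dynamic change
        for (x, y, w, h) in regions:
            # S6: crop the suspected region from the recent frames and let
            # the behavior model classify its dynamic change.
            clip = [[row[x:x + w] for row in f[y:y + h]]
                    for f in buffer[-clip_len:]]
            if behavior_classifier(clip):
                yield (x, y, w, h)
```

Buffering the last `clip_len` frames is one simple way to give the classifier temporal context without re-decoding the stream; the patent's own interception scheme (step S6) is only partially visible here.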

Embodiment 2

[0054] This embodiment is a specific description about the training of the single-stage detection model in Embodiment 1.

[0055] Said S1 specifically includes the following steps:

[0056] a. Data preparation: shooting and/or collecting flame videos;

[0057] b. Labeling: first use OpenCV to decode the video into pictures, then use labeling software such as LabelImg or Labelme to mark the flame in each picture with a rectangular bounding box, as shown in Figure 3; from the annotation, obtain the flame's position in the image in (x, y, w, h) format, where x and y are the coordinates of the upper-left corner of the rectangle enclosing the flame, and w and h are the width and height of that rectangle;
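Labeling tools such as LabelImg typically export boxes as corner pairs (xmin, ymin, xmax, ymax), while the method above stores the top-left corner plus width and height. A minimal converter between the two (an illustrative helper, not from the patent):

```python
def corners_to_xywh(xmin, ymin, xmax, ymax):
    """Convert a corner-pair box to (x, y, w, h), (x, y) at the top-left."""
    return (xmin, ymin, xmax - xmin, ymax - ymin)

def xywh_to_corners(x, y, w, h):
    """Inverse conversion, back to (xmin, ymin, xmax, ymax)."""
    return (x, y, x + w, y + h)
```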

[0058] c. Training: Use the full YOLOv3 network or the EfficientNet-B0 framework as the backbone, followed by the lightweight YOLOv3 detection head, as the network structure of the single-stage detection model, and then use the m...
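Single-stage detectors such as YOLOv3 emit many overlapping candidate boxes per object, so their output is conventionally filtered with non-maximum suppression (NMS) before being passed downstream. The patent text does not describe this step; the sketch below shows the standard greedy NMS over (x, y, w, h) boxes for illustration.

```python
def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def nms(boxes, scores, thresh=0.5):
    """Greedy non-maximum suppression; returns indices of kept boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        # Keep a box only if it does not overlap a kept box too strongly.
        if all(iou(boxes[i], boxes[j]) <= thresh for j in keep):
            keep.append(i)
    return keep
```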

Embodiment 3

[0061] This embodiment is a specific description about the training of the class behavior recognition and classification model in the first embodiment.

[0062] Said S2 specifically includes the following steps:

[0063] A. Data preparation: shooting and/or collecting flame videos;

[0064] B. Annotation: Annotators mark the start frame and end frame of each video flame and the position of the flame;

[0065] C. Training: Use the ECO behavior recognition network structure as the network structure of the class behavior recognition and classification model. Among the annotated videos obtained in step B, the events from the start of a flame to its end are counted as positive sample events, and sufficiently long video segments not annotated as flame are recorded as negative sample events. Sixteen frames are sampled from each positive and negative sample event as network input, and the positive/negative labels are used to calculate the difference between the predicted res...



Abstract

The invention discloses a deep-learning-based flame detection method for security scenes, belonging to the technical field of security, and mainly comprising the following steps: detecting a suspected flame region in a picture decoded from a monitoring video, using a single-stage detection model trained on a neural network to recognize flame shapes; according to the identified suspected flame region, intercepting from the video the frames corresponding to that region to obtain a video segment; dividing the video segment into N sub-segments and sampling one frame from each sub-segment to obtain the sampled frames; and finally, inputting the sampled frames into a class behavior recognition and classification model, trained on a neural network to recognize the dynamic change of flames, to classify whether the region is a flame. The method extracts the single-frame appearance features of the suspected flame through the single-stage detection model while also taking into account the dynamic information across preceding and following frames; these richer features greatly improve the final classification, as well as the real-time performance and efficiency of flame detection.

Description

Technical field

[0001] The invention relates to the field of security technology, and in particular to a method for detecting flames in security scenes based on deep learning.

Background technique

[0002] Among the various disasters, fire is one of the main hazards that most frequently and commonly threatens public safety and social development. Fires are typically disasters in which a small fire gradually spreads in time or space and burns out of control. Once a large fire has formed, it is not only difficult to extinguish, but the many uncontrollable factors in the environment can lead to even more dangerous situations such as explosions, causing not only property loss but also casualties. Therefore, it is necessary to detect fire points in the environment, especially in environments lacking direct monitoring by personnel, so that fires can be discovered and eliminated at the initial stage, before they develop into large fires. ...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/00, G06K9/20, G06K9/62, G06N3/04, G06N3/08
CPC: G06N3/084, G06V20/42, G06V20/49, G06V20/46, G06V20/52, G06V10/22, G06V2201/07, G06N3/045, G06F18/241, G06F18/253
Inventor: 吉翔
Owner: 成都睿沿科技有限公司