
Neural network training and construction method and device, and object detection method and device

A neural network training and construction technology, applied in the field of neural-network-based object detection methods and devices, which can solve problems such as sensitivity to inherent variations of the target, inaccurate target recognition, and low detection efficiency.

Active Publication Date: 2017-01-04
BEIJING KUANGSHI TECH +1

AI Technical Summary

Problems solved by technology

However, the detection results obtained by current CNN-based object detection methods still suffer from technical problems such as sensitivity to inherent variations of the target, inaccurate target recognition, and low detection efficiency.

Detailed Description of the Embodiments

[0100] In order to make the objects, technical solutions, and advantages of the present disclosure more apparent, exemplary embodiments according to the present disclosure will be described in detail below with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present disclosure, and it should be understood that the present disclosure is not limited by the exemplary embodiments described here. All other embodiments obtained by those skilled in the art, based on the embodiments described in the present disclosure and without creative effort, shall fall within the protection scope of the present disclosure.

[0101] Firstly, the basic concepts involved in the present disclosure and the basic idea of training and constructing the neural network for object detection according to the embodiments of the present disclosure are briefly introduced.

[0102] As we ...

Abstract

The embodiments of the invention provide a neural network training and construction method and device for object detection, and an object detection method and device based on a neural network. The neural network training and construction method for object detection comprises the steps of: inputting a training image containing a training object into the neural network to obtain a predicted bounding box of the training object; obtaining a first loss function according to the ratio of the intersection area to the union area of the predicted bounding box and a ground-truth bounding box, where the ground-truth bounding box is the bounding box of the training object marked in the training image in advance; and adjusting the parameters of the neural network at least according to the first loss function so as to train the neural network. By using the first loss function, the invention regresses the target bounding box as a single integrated unit, which significantly improves the object detection accuracy of the neural network. Moreover, the two branch structures of the neural network effectively improve the training and detection efficiency of the neural network.
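
The "ratio of the intersection area to the union area" in the abstract is the standard intersection-over-union (IoU) measure between the predicted and ground-truth boxes. Below is a minimal sketch of how such a first loss function could be computed; it is an illustration under assumptions, not the patented implementation: the corner-coordinate box representation, the -ln(IoU) form of the loss, and the use of PyTorch are all choices made for the example.

import torch

def iou_loss(pred, target, eps=1e-7):
    # pred, target: (N, 4) tensors of boxes as (x1, y1, x2, y2).
    # Intersection rectangle between each predicted and ground-truth box.
    ix1 = torch.max(pred[:, 0], target[:, 0])
    iy1 = torch.max(pred[:, 1], target[:, 1])
    ix2 = torch.min(pred[:, 2], target[:, 2])
    iy2 = torch.min(pred[:, 3], target[:, 3])
    inter = (ix2 - ix1).clamp(min=0) * (iy2 - iy1).clamp(min=0)

    # Union = area(pred) + area(target) - intersection.
    area_p = (pred[:, 2] - pred[:, 0]).clamp(min=0) * (pred[:, 3] - pred[:, 1]).clamp(min=0)
    area_t = (target[:, 2] - target[:, 0]).clamp(min=0) * (target[:, 3] - target[:, 1]).clamp(min=0)
    union = area_p + area_t - inter + eps

    iou = inter / union
    # One common choice is to minimise -ln(IoU); the exact functional
    # form used in the disclosure may differ (e.g. 1 - IoU).
    return (-torch.log(iou + eps)).mean()

# Example: the gradient flows back through all four box coordinates at once,
# which is what regressing the box as a single integrated unit means here.
pred = torch.tensor([[10.0, 10.0, 50.0, 60.0]], requires_grad=True)
gt = torch.tensor([[12.0, 8.0, 48.0, 62.0]])
loss = iou_loss(pred, gt)
loss.backward()

Because the loss depends jointly on all four coordinates through the intersection and union areas, minimising it couples the coordinates together, in contrast to per-coordinate regression losses that treat each coordinate independently.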

Description

Technical Field
[0001] Embodiments of the present disclosure relate to image processing, and in particular to a method and device for training and constructing a neural network for object detection, and a method and device for object detection based on a neural network.
Background
[0002] Object detection is a basic research topic in the field of computer vision, with broad application prospects in areas such as face recognition, security monitoring, and dynamic tracking. Object detection refers to detecting and identifying a specific target (such as a face) in any given image and returning the position and size information of the target, for example by outputting a bounding box surrounding the target. Object detection is a complex and challenging pattern detection problem with two main difficulties: on the one hand, intrinsic variations of the target, such as changes in detail and occlusions; on the other hand, variations caused by imaging angles and lig...
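
As a concrete illustration of a detector that outputs both a confidence that a target is present and a bounding box surrounding it, the sketch below shows a small fully convolutional network with a shared backbone and two branches, in the spirit of the two-branch structure mentioned in the abstract. The backbone depth, channel counts, strides, and the exact meaning of the branch outputs are assumptions for illustration and are not taken from this disclosure.

import torch
import torch.nn as nn

class TwoBranchDetector(nn.Module):
    def __init__(self, in_channels=3, feat_channels=64):
        super().__init__()
        # Shared convolutional backbone (downsamples the input by 4x).
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, feat_channels, 3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(feat_channels, feat_channels, 3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Branch 1: per-location confidence that a target is present.
        self.confidence = nn.Conv2d(feat_channels, 1, 1)
        # Branch 2: per-location bounding-box regression (4 values,
        # e.g. offsets to the four sides of the box).
        self.bbox = nn.Conv2d(feat_channels, 4, 1)

    def forward(self, x):
        feat = self.backbone(x)
        return torch.sigmoid(self.confidence(feat)), self.bbox(feat)

# Example: a 128x128 image yields a 32x32 confidence map and a 32x32 box map.
scores, boxes = TwoBranchDetector()(torch.randn(1, 3, 128, 128))
print(scores.shape, boxes.shape)  # torch.Size([1, 1, 32, 32]) torch.Size([1, 4, 32, 32])

During training, the confidence branch can be supervised with a classification loss and the box branch with an IoU-style loss like the one sketched after the abstract; this is one plausible reading of how two branches could improve training and detection efficiency together.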

Application Information

Patent Type & Authority: Application (China)
IPC(8): G06K9/62, G06K9/00, G06V10/764, G06V10/774
CPC: G06V40/161, G06F18/214, G06T2207/20084, G06T7/11, G06N3/08, G06V40/168, G06V10/82, G06V10/764, G06V10/774, G06V40/172, G06F18/24, G06T7/337, G06T2207/20132, G06T2207/30201, G06T2210/12
Inventor: 余家辉, 印奇
Owner: BEIJING KUANGSHI TECH