
Method for accurately positioning candidate bounding boxes in target segmentation network

A technology for target segmentation and precise positioning, applied to biological neural network models, image analysis, image data processing, and related fields. It addresses problems such as multiple detections of the same target, deviation of the four-dimensional position coordinates of candidate bounding boxes, and missed target detection.

Pending Publication Date: 2020-11-24
ZHEJIANG UNIV OF TECH

AI Technical Summary

Problems solved by technology

[0004] After the above process, the four-dimensional position coordinates of the obtained candidate bounding box (its center position coordinates together with its width and height) often deviate greatly from the real box in which the pre-calibrated target lies, and this leads to two problems.
First, if the overlap rate between the obtained candidate bounding box and the real calibration box is greater than the predetermined threshold, the candidate bounding box is eliminated, and some targets are very likely to be missed.
Second, it is often difficult to pre-set the overlap-rate threshold between the candidate bounding box and the real calibration box: if it is set too small, some targets may be missed; if it is set too large, the same target is easily detected multiple times or falsely detected.
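
For reference, the "overlap rate" discussed above is the intersection over union (IoU) between a candidate bounding box and the real calibration box. Below is a minimal, self-contained Python sketch of that computation, assuming the (center, width, height) box parameterization mentioned in paragraph [0004]; the example coordinates are illustrative, not values from the patent.

```python
# Overlap rate (IoU) between a candidate box and a ground-truth ("real calibration") box.
# Boxes use the (cx, cy, w, h) parameterization; example values below are arbitrary.

def to_corners(box):
    """Convert (cx, cy, w, h) to (x1, y1, x2, y2)."""
    cx, cy, w, h = box
    return cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2

def iou(box_a, box_b):
    ax1, ay1, ax2, ay2 = to_corners(box_a)
    bx1, by1, bx2, by2 = to_corners(box_b)
    # Intersection rectangle (zero if the boxes do not overlap).
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

candidate = (52.0, 48.0, 30.0, 40.0)    # deviated prediction
calibration = (50.0, 50.0, 32.0, 42.0)  # pre-calibrated ground truth
print(iou(candidate, calibration))      # about 0.80; this is what gets compared to the threshold
```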




Detailed Description of the Embodiments

[0070] In order to overcome the above-mentioned deficiencies of the prior art, the present invention provides a method for precisely positioning candidate bounding boxes in a target segmentation network, addressing problems such as positioning deviation, missed detection, false detection, and partial detection of targets. First, a lightweight convolutional neural network model is constructed and its network parameters are reasonably designed, so that operations such as transmission and convolution can be performed on the currently obtained, inaccurate prediction map of the candidate bounding boxes; then the intersection-over-union ratio and the confidence score are designed and solved within the network; finally, redundant candidate bounding boxes are removed according to the settings of the intersection-over-union ratio and the confidence score, and the remaining candidate bounding boxes are fine-tuned to achieve precise positioning.
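
Paragraph [0070] describes removing redundant candidate boxes according to IoU and confidence-score settings, but the visible text does not give the exact procedure. The sketch below shows one common way such a filter is implemented, a greedy, NMS-style pass driven by confidence scores; the 0.5 IoU threshold is an assumed example, not a value from the patent.

```python
# Hedged sketch of a greedy, confidence-driven redundancy filter (NMS-style).
import numpy as np

def remove_redundant(boxes, scores, iou_threshold=0.5):
    """boxes: (N, 4) array of (x1, y1, x2, y2); scores: (N,) confidence scores.
    Returns indices of surviving boxes, highest-confidence boxes kept first."""
    order = np.argsort(scores)[::-1]          # process highest-confidence boxes first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        rest = order[1:]
        # IoU of the kept box against all remaining candidates.
        x1 = np.maximum(boxes[i, 0], boxes[rest, 0])
        y1 = np.maximum(boxes[i, 1], boxes[rest, 1])
        x2 = np.minimum(boxes[i, 2], boxes[rest, 2])
        y2 = np.minimum(boxes[i, 3], boxes[rest, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_i = (boxes[i, 2] - boxes[i, 0]) * (boxes[i, 3] - boxes[i, 1])
        area_r = (boxes[rest, 2] - boxes[rest, 0]) * (boxes[rest, 3] - boxes[rest, 1])
        ious = inter / (area_i + area_r - inter)
        # Drop candidates that overlap the kept box more than the threshold.
        order = rest[ious <= iou_threshold]
    return keep
```

The boxes kept by such a pass would then be fine-tuned (for example, by predicted coordinate corrections) to finish the precise-positioning step described above.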

[0071] To achieve the above object, the present invention adopts the following technical solution. ...



Abstract

The invention discloses a method for accurately positioning candidate bounding boxes in a target segmentation network. The method comprises the steps of: 1. constructing a lightweight convolutional neural network model and designing its network parameters; 2. solving the intersection-over-union ratio (IoU) in the IoU layer of the network model; 3. solving the confidence scores of the candidate bounding boxes in the network model; and 4. removing redundant candidate bounding boxes according to the settings of the intersection-over-union ratio and the confidence score, so that the finally output candidate bounding box has the highest confidence score and is closest to the real calibration box. The method can find a balance between missed detection and false detection of targets and brings the candidate bounding boxes closer to the real calibration boxes; at the same time, the network structure and the confidence-solving method designed by the invention have a certain generalization ability and can be applied to relatively complex scenes.
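
Step 1 of the abstract only states that a lightweight convolutional neural network model is constructed and its parameters designed; the visible text does not disclose the architecture. The class below is a hypothetical PyTorch sketch of such a lightweight refinement head, taking a candidate-box prediction map and emitting four coordinate corrections plus one confidence score per location. The channel counts and layer depth are assumptions, not the patented design.

```python
import torch
import torch.nn as nn

class BoxRefineHead(nn.Module):
    """Hypothetical lightweight head: refines candidate-box predictions and scores them."""
    def __init__(self, in_channels=256, hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_channels, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        self.deltas = nn.Conv2d(hidden, 4, kernel_size=1)  # (dx, dy, dw, dh) per location
        self.conf = nn.Conv2d(hidden, 1, kernel_size=1)    # confidence score per location

    def forward(self, prediction_map):
        feat = self.body(prediction_map)
        return self.deltas(feat), torch.sigmoid(self.conf(feat))

# Example: a 32x32 prediction map with 256 channels for a batch of one image.
head = BoxRefineHead()
deltas, confidence = head(torch.randn(1, 256, 32, 32))
```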

Description

Technical Field
[0001] The invention relates to a method for the precise positioning of candidate bounding boxes in a target segmentation network.
[0002] Technical Background
[0003] When machine learning is applied to segment a target object in an image, the input original image is first scaled, and the fixed-size image is then fed into a pre-set convolutional neural network for feature extraction; anchor points are then placed over the objects on the feature map, and regions of interest are extracted to determine the candidate bounding boxes of the foreground (the target object) and the background.
[0004] After the above process, the four-dimensional position coordinates of the obtained candidate bounding box (its center position coordinates together with its width and height) often deviate greatly from the real box in which the pre-calibrated target lies, which leads to two problems. First, if the overlap rate between the...
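
As an illustration of the background pipeline in [0003] (scale the image, extract CNN features, place anchor points on the feature map, and take regions of interest as candidate boxes), the sketch below generates anchor boxes over a feature map in NumPy. The stride, sizes, and aspect ratios are arbitrary assumptions rather than values from the patent.

```python
import numpy as np

def anchor_centers(feature_h, feature_w, stride=16):
    """Map every feature-map cell back to its center in input-image coordinates."""
    ys = (np.arange(feature_h) + 0.5) * stride
    xs = (np.arange(feature_w) + 0.5) * stride
    cx, cy = np.meshgrid(xs, ys)
    return np.stack([cx.ravel(), cy.ravel()], axis=1)   # shape (H*W, 2)

def anchors_at(centers, sizes=(64, 128), ratios=(0.5, 1.0, 2.0)):
    """Attach (cx, cy, w, h) anchors of several sizes and aspect ratios to each center."""
    boxes = []
    for s in sizes:
        for r in ratios:
            w, h = s * np.sqrt(r), s / np.sqrt(r)   # aspect ratio w/h == r
            wh = np.tile([w, h], (centers.shape[0], 1))
            boxes.append(np.hstack([centers, wh]))
    return np.vstack(boxes)   # shape (num_centers * len(sizes) * len(ratios), 4)

# Example: a 600x800 input image mapped to a 38x50 feature map at stride 16.
candidate_boxes = anchors_at(anchor_centers(38, 50))
```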


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T7/62, G06T7/136, G06T7/11, G06N3/04, G06K9/32
CPC: G06T7/62, G06T7/11, G06T7/136, G06T2207/20084, G06V10/25, G06N3/045
Inventor: 张烨, 樊一超, 陈威慧
Owner: ZHEJIANG UNIV OF TECH