
Knowledge distillation method based on semantic segmentation intra-class feature difference

A semantic segmentation and knowledge distillation technology, applied in the field of knowledge distillation based on intra-class feature differences in semantic segmentation. It addresses the problem that existing student models are difficult to align with the teacher model in terms of intra-class feature differences, which limits the accuracy of the student model; the method maintains inference speed while improving performance and achieving high accuracy.

Active Publication Date: 2020-04-24
HUAZHONG UNIV OF SCI & TECH

AI Technical Summary

Problems solved by technology

However, the student models obtained by these existing methods are often difficult to align with the teacher model in terms of intra-class feature differences, which limits the accuracy gains achievable by the student model.

Method used




Embodiment Construction

[0036] In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below in conjunction with the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain the present invention, not to limit it. In addition, the technical features involved in the various embodiments of the present invention described below can be combined with each other as long as they do not conflict.

[0037] The technical terms of the present invention are first explained and illustrated below:

[0038] ResNet: Residual Network, a classic deep convolutional neural network architecture. It won five first-place results in the ILSVRC and COCO 2015 competitions, with performance greatly exceeding the second-place entries. The network mainly consists of convolutional...
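The defining feature of ResNet is the skip connection: each block learns only a residual correction F(x) and outputs F(x) + x. The following is an illustrative sketch, not the patent's code, using dense layers instead of convolutions for brevity:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, w1, w2):
    # The block computes a residual F(x) = W2 @ relu(W1 @ x) and adds the
    # input back through the skip connection before the final activation.
    return relu(w2 @ relu(w1 @ x) + x)

# With the second layer zero-initialized, F(x) = 0 and the block reduces to
# an identity mapping for non-negative inputs -- the property that lets very
# deep residual networks train stably.
x = np.array([1.0, 2.0, 3.0, 4.0])
out = residual_block(x, np.eye(4), np.zeros((4, 4)))
print(out)  # [1. 2. 3. 4.]
```

In a real ResNet the two dense layers are 3x3 convolutions with batch normalization, but the residual addition works identically.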



Abstract

The invention discloses a knowledge distillation method based on semantic segmentation intra-class feature differences, and aims to transfer the dark knowledge learned by a complex model (teacher model) to a simplified model (student model), thereby improving the accuracy of a semantic segmentation model while maintaining its speed. The method includes: firstly, obtaining convolutional features from the teacher model and the student model respectively; then, obtaining a feature map of each category center through a mask-guided average pooling operation, and calculating the feature similarity between each pixel and its corresponding category center to obtain an intra-class feature difference map; and finally, aligning the intra-class feature difference map of the student model with that of the teacher model, so as to improve the accuracy of the student model. Compared with the prior art, the distillation method provided by the invention is novel in concept, the obtained semantic segmentation model achieves good results in both accuracy and speed, and the method can be conveniently combined with other related technologies, giving it high practical application value.
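The pipeline in the abstract (mask-guided average pooling per class, per-pixel similarity to the class center, then aligning the student's difference map with the teacher's) can be sketched as follows. This is a reconstruction from the abstract, not the patent's implementation; the similarity measure here is assumed to be cosine similarity and the alignment loss an L2 distance:

```python
import numpy as np

def intra_class_difference_map(features, labels, num_classes):
    """Per-pixel similarity to the class center (mask-guided average pooling).

    features: (C, H, W) convolutional feature map from teacher or student.
    labels:   (H, W) integer class ids (the segmentation mask).
    Returns an (H, W) map of cosine similarity between each pixel's feature
    vector and the average-pooled center of that pixel's class.
    """
    C, H, W = features.shape
    flat = features.reshape(C, -1)            # (C, H*W) pixel feature vectors
    ids = labels.reshape(-1)                  # (H*W,) class id per pixel
    diff = np.zeros(H * W)
    for c in range(num_classes):
        mask = ids == c
        if not mask.any():
            continue
        f = flat[:, mask]                     # features of this class's pixels
        center = f.mean(axis=1)               # mask-guided average pooling
        # cosine similarity of each pixel feature to its class center
        sim = (f * center[:, None]).sum(axis=0) / (
            np.linalg.norm(f, axis=0) * np.linalg.norm(center) + 1e-8)
        diff[mask] = sim
    return diff.reshape(H, W)

def alignment_loss(student_map, teacher_map):
    """L2 loss aligning the student's difference map with the teacher's."""
    return float(np.mean((student_map - teacher_map) ** 2))
```

During distillation this loss would be added to the student's ordinary segmentation loss, so the student is penalized whenever its intra-class feature structure deviates from the teacher's.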

Description

technical field

[0001] The invention belongs to the technical field of computer vision, and more specifically relates to a knowledge distillation method based on intra-class feature differences in semantic segmentation.

Background technique

[0002] Semantic segmentation refers to the pixel-level understanding of image scenes by computers, and is a basic research direction in the field of computer vision. In recent years, with the rapid development of fully convolutional neural networks, the accuracy of semantic segmentation models has continuously improved, but most of these models require a large amount of computing resources, which limits their applications in real life, such as autonomous driving, virtual reality, robotics, and more.

[0003] To solve this problem, model compression is a common approach, which can usually be divided into three categories: quantization, pruning, and knowledge distillation. Among them, the idea of knowle...

Claims


Application Information

IPC(8): G06T7/11, G06K9/46, G06K9/62
CPC: G06T7/11, G06T2207/10004, G06V10/40, G06F18/214, G06F18/2415
Inventors: 许永超, 王裕康, 周维, 白翔
Owner HUAZHONG UNIV OF SCI & TECH