Knowledge distillation method and device fusing channel and relation feature learning, and equipment

A knowledge distillation technology fusing channel and relation feature learning. It addresses the problems that existing methods ignore relation feature knowledge among samples, do not consider channel-related knowledge on the convolutional layers, and therefore cannot effectively improve student network performance, achieving the effect of improving the student network's performance.

Inactive Publication Date: 2021-09-03
JIANGSU UNIV
Cites: 0 | Cited by: 5

AI Technical Summary

Problems solved by technology

However, the above methods neither take into account the channel-related knowledge on the convolutional layers nor the relation feature knowledge among different samples within a channel.
Moreover, during learning the student network is affected by the teacher's erroneous knowledge, so the student network's performance cannot be effectively improved in the later stages of training.
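
To make the channel-related knowledge concrete, below is a minimal PyTorch-style sketch of one plausible channel-wise feature distillation term. The function name, the temperature, and the choice of a softmax over spatial positions per channel are illustrative assumptions, not the patent's exact formulation; the teacher and student feature maps are assumed already aligned in channel count (e.g., via a 1x1 convolution adapter).

```python
import torch
import torch.nn.functional as F

def channel_distillation_loss(f_s: torch.Tensor, f_t: torch.Tensor,
                              tau: float = 4.0) -> torch.Tensor:
    """Match student and teacher channel activation distributions.

    f_s, f_t: convolutional feature maps of shape (N, C, H, W),
    assumed to have the same shape. Each channel's spatial map is
    normalized into a distribution, and the student is pulled toward
    the teacher with KL divergence. (Illustrative sketch only.)
    """
    n, c = f_s.shape[:2]
    # Softmax over spatial positions, separately for each channel.
    log_p_s = F.log_softmax(f_s.reshape(n, c, -1) / tau, dim=2)
    p_t = F.softmax(f_t.reshape(n, c, -1) / tau, dim=2)
    # KL divergence, scaled by tau^2 as in standard temperature distillation.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * tau ** 2
```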




Detailed Description of the Embodiments

[0024] In order to make the objects, technical solutions, and advantages of the present disclosure clearer, the technical solutions of the present disclosure will be described clearly and completely below with reference to the drawings of the embodiments. Obviously, the described embodiments are only a part of the embodiments of the present disclosure, not all of them. Based on the embodiments described herein, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present disclosure.

[0025] The purpose of this application is to provide a knowledge distillation method, apparatus, and device fusing channel and relation feature learning. The method includes constructing an untrained student network and a pre-trained teacher network; inputting the training data into the teacher network and the student network to obtain the output of the student network and the output of the teacher network, where the training data also includes the correspo...
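
As a reading aid, here is a minimal, self-contained sketch of the setup this paragraph describes: a frozen pre-trained teacher, an untrained student, and both networks fed the same batch. The tiny CNNs and the random batch are stand-ins, not the patent's actual networks or data.

```python
import torch
import torch.nn as nn

# Stand-in networks: the teacher is larger, the student is smaller.
teacher = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10))
student = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10))

teacher.eval()                      # the teacher is pre-trained and frozen
for p in teacher.parameters():
    p.requires_grad_(False)

images = torch.randn(4, 3, 32, 32)  # stand-in for one batch of training data
labels = torch.randint(0, 10, (4,)) # the corresponding real label data

with torch.no_grad():
    t_logits = teacher(images)      # output result of the teacher network
s_logits = student(images)          # output result of the student network
```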



Abstract

The invention relates to the technical field of knowledge distillation, and discloses a knowledge distillation method, device, and equipment fusing channel and relation feature learning. The method comprises the steps of: constructing an untrained student network and a pre-trained teacher network; respectively inputting the training data into the student network and the teacher network to obtain the output result of each, wherein the training data further comprises corresponding real label data; determining a distillation loss function based on the channel data of the student network and the teacher network, the output results of both networks, and the relation features among samples transferred from the teacher network; and performing iterative training on the student network based on the distillation loss function. With this method and device, the student network model can be effectively compressed, and the performance of the student network can be further improved, even exceeding that of the large teacher network.
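
The loss described in the abstract combines a supervised term on the real labels with transfer terms built from the networks' outputs and from sample relations. Below is a hedged sketch of such a combined distillation loss; the pairwise cosine-similarity relation matrix, the MSE relation term, and the weights alpha and beta are assumptions standing in for the patent's exact definitions, and the channel term from the earlier sketch would be added in the same way.

```python
import torch
import torch.nn.functional as F

def relation_matrix(feats: torch.Tensor) -> torch.Tensor:
    """Pairwise cosine similarity among the samples in a batch --
    one plausible reading of 'relation features' (an assumption)."""
    z = F.normalize(feats.flatten(1), dim=1)  # (N, D), unit-norm rows
    return z @ z.t()                          # (N, N) sample relations

def distillation_loss(s_logits, t_logits, s_feats, t_feats, labels,
                      tau=4.0, alpha=0.5, beta=1.0):
    # Supervised term against the real label data.
    ce = F.cross_entropy(s_logits, labels)
    # Soft-label term on the output results of both networks.
    kd = F.kl_div(F.log_softmax(s_logits / tau, dim=1),
                  F.softmax(t_logits / tau, dim=1),
                  reduction="batchmean") * tau ** 2
    # Relation term: match the teacher's sample-to-sample structure.
    rel = F.mse_loss(relation_matrix(s_feats), relation_matrix(t_feats))
    return ce + alpha * kd + beta * rel
```

Iterative training then simply minimizes this loss with a standard optimizer over the student's parameters while the teacher stays frozen.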

Description

Technical field

[0001] The present invention relates to the field of knowledge distillation, and in particular to a knowledge distillation method, apparatus, and device fusing channel and relation feature learning.

Background technique

[0002] Deep learning technology has developed very rapidly in recent years and has made breakthroughs in computer vision and natural language processing. From AlexNet, proposed in 2012, to DenseNet in 2016, these neural networks have shown powerful performance and surpassed traditional algorithms on the image classification task. Deeper neural networks can extract more information and obtain better representations. However, deeper and more complex neural networks require more computing power and inference time to meet the real-time responses of practical industrial applications, and mobile terminals running in real time cannot accept a large number of network parameters and computations. Neural networks such as AlexNet and DenseNet only consider how to improve accuracy,...


Application Information

IPC(8): G06N5/02, G06N3/04, G06N3/08
CPC: G06N5/025, G06N3/08, G06N3/045
Inventor: 苟建平, 熊祥硕, 陈潇君, 夏书银, 欧卫华, 柯佳
Owner JIANGSU UNIV