Method for obtaining neural network model after adversarial distillation and computing device

A neural network model and computing device technology, applied in the field of adversarially distilled neural network models, methods and computing devices; it addresses the problem that existing processing methods do not achieve the expected effect, and achieves the effects of improving generalization ability and reducing the error rate.

Active Publication Date: 2018-05-01
XIAMEN MEITUZHIJIA TECH

AI Technical Summary

Problems solved by technology

However, due to factors such as the diversity of ways in which adversarial samples can be constructed, this processing method did not achieve the expected effect.

Embodiment Construction

[0026] Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. Although exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited by the embodiments set forth herein. Rather, these embodiments are provided for a more thorough understanding of the present disclosure and to fully convey its scope to those skilled in the art.

[0027] Figure 1 is a block diagram of an example computing device 100. In a basic configuration 102, computing device 100 typically includes system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.

[0028] Depending on the desired configuration, processor 104 may be any type of processor including, but not limited to, a microprocesso...

Abstract

The present invention discloses a method for obtaining a neural network model after adversarial distillation. The neural network model has a forward network with a feature layer structure and a softmax layer that outputs probability vectors over multiple classes. The method is suitable for execution in a computing device and comprises the steps of: adding a zooming layer between the forward network of an original neural network model and the softmax layer according to a distillation temperature, generating a first neural network model; training the first neural network model with the first labels of the training samples themselves, obtaining a second neural network model; inputting the training samples into the second neural network model and outputting, through the softmax layer, second labels expressing the probability vectors of the training samples over the multiple classes; performing constraint training of the second neural network model with the second labels and the first labels, obtaining a third neural network model; and deleting the zooming layer in the third neural network model, obtaining the neural network model after adversarial distillation. The present invention further discloses a corresponding computing device.
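Read as a training procedure, the abstract describes five concrete steps. The following is a minimal PyTorch sketch of how such an adversarial-distillation pipeline could look; the forward network, data loader, distillation temperature T, loss weighting ALPHA and optimizer settings are assumptions made for illustration, not details taken from the patent.

```python
# Minimal sketch (assumptions: PyTorch, SGD, T=20, ALPHA=0.5) of the pipeline in the
# abstract: add a zooming (scaling) layer before softmax, train on hard labels,
# harvest soft labels, continue training under both labels, then remove the layer.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

T = 20.0      # distillation temperature (assumed value)
ALPHA = 0.5   # weight between second-label (soft) and first-label (hard) terms (assumed)

class ScaledNet(nn.Module):
    """Forward network followed by a 'zooming' (scaling) layer before softmax."""
    def __init__(self, forward_net: nn.Module, temperature: float):
        super().__init__()
        self.forward_net = forward_net
        self.temperature = temperature

    def forward(self, x):
        # The zooming layer divides the logits by the distillation temperature;
        # softmax over these scaled logits is applied inside the loss functions.
        return self.forward_net(x) / self.temperature

def distill(forward_net: nn.Module, loader, epochs: int = 10) -> nn.Module:
    # Step 1: add the zooming layer -> first neural network model.
    model = ScaledNet(forward_net, T)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)

    # Step 2: train with the samples' own (first) labels -> second model.
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()

    # Step 3: a frozen copy of the second model produces the second labels
    # (probability vectors from the softmax at temperature T).
    teacher = copy.deepcopy(model).eval()

    # Step 4: continue training under both second and first labels -> third model.
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            with torch.no_grad():
                soft = F.softmax(teacher(x), dim=1)
            scaled = model(x)
            loss = ALPHA * F.kl_div(F.log_softmax(scaled, dim=1), soft,
                                    reduction="batchmean") \
                   + (1 - ALPHA) * F.cross_entropy(scaled, y)
            loss.backward()
            opt.step()

    # Step 5: delete the zooming layer, i.e. keep only the forward network, so that
    # at inference time softmax sees unscaled logits.
    return model.forward_net
```

In this reading, deleting the zooming layer at the end simply means the deployed forward network feeds unscaled logits to softmax, which is what the abstract calls the neural network model after adversarial distillation.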

Description

Technical field

[0001] The invention relates to the technical field of image processing, and in particular to a method and computing device for obtaining a neural network model after adversarial distillation.

Background technique

[0002] Deep neural networks can achieve very accurate results on today's classification and regression problems. With the support of massive data, trained deep neural network models also have strong generalization capabilities, and in recent years deep neural networks have therefore been widely used in computer vision, speech recognition and other fields. However, these deep neural network models also have defects and vulnerabilities in practical applications. For example, even when the structure and parameters of a network model are unknown, specially crafted small perturbations can be added to its input; such perturbations do not affect a human's subjective judgment, yet they can cause the network model to output, with a high degree of confidence, the wrong...
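For context, the kind of perturbation the background refers to is commonly constructed with the fast gradient sign method (FGSM), a standard technique from the adversarial-example literature. The sketch below is purely illustrative background, not the patent's own construction; the model, loss and epsilon value are assumptions.

```python
# Illustrative FGSM-style adversarial perturbation (standard background technique,
# not the patent's method): a small change to the input that can flip the model's
# prediction with high confidence. epsilon is an assumed value.
import torch
import torch.nn.functional as F

def fgsm_example(model: torch.nn.Module, x: torch.Tensor, label: torch.Tensor,
                 epsilon: float = 0.01) -> torch.Tensor:
    """Return x plus a small perturbation that tends to cause misclassification."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), label)
    loss.backward()
    # Move each input element a tiny step in the direction that increases the loss;
    # the change is visually negligible but can yield a confident wrong output.
    return (x + epsilon * x.grad.sign()).detach()
```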

Claims

Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/08G06N3/04
CPCG06N3/082G06N3/045
Inventor 陈良洪炜冬张伟许清泉王喆
Owner XIAMEN MEITUZHIJIA TECH