
Model compression method and device, computing equipment and storage medium

A model compression method and related device technology, applied in computing, biological neural network models, image enhancement, and other fields, which can solve the problem of low model pruning accuracy.

Active Publication Date: 2021-06-08
Applicant: PING AN TECH (SHENZHEN) CO LTD

AI Technical Summary

Problems solved by technology

[0004] Embodiments of the present invention provide a model compression method, device, computing equipment, and storage medium to solve the current problem of low model pruning accuracy caused by evaluating the importance of feature channels based on the numerical values in feature maps.


Embodiment Construction

[0028] The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.

[0029] The model compression method can be applied in an application environment such as that shown in Figure 1, in which a computer device communicates with a server over a network. The computer device can be, but is not limited to, a personal computer, laptop, smartphone, tablet, or portable wearable device. The server can be implemented as an independent server.

[0030] In one embodiment, as shown in Figure 2, a model compression method is provided...


Abstract

The invention relates to the technical field of artificial intelligence, and in particular to a model compression method, device, computing equipment, and storage medium. The model compression method comprises the following steps: acquiring a test image and a to-be-compressed model, wherein the to-be-compressed model comprises a plurality of cascaded feature extraction layers; inputting the test image into each feature extraction layer and performing feature extraction through a filter to obtain a multi-channel feature map, wherein each feature channel in the multi-channel feature map corresponds to one sub-feature map; converting each sub-feature map into a visual feature map and determining the importance of the feature channel corresponding to that sub-feature map based on a plurality of feature values in the visual feature map; determining target pruning channels according to the importance of the feature channels; and pruning the target pruning channels to obtain a compressed target model. This method can effectively improve model pruning accuracy, achieve accurate model compression, effectively reduce the model's computational load, and facilitate deployment.
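The step sequence in the abstract can be made concrete with a short sketch. This is only a minimal illustration, assuming a PyTorch model whose feature extraction layers are `nn.Conv2d` modules; the min-max normalisation used to build the "visual" feature map, the mean-intensity scoring rule in `channel_importance`, and the fixed pruning ratio are assumptions for illustration, not the patent's exact procedure.

```python
import torch
import torch.nn as nn

def to_visual_map(sub_feature_map: torch.Tensor) -> torch.Tensor:
    """Min-max normalise one channel's sub-feature map to [0, 255],
    mimicking a visualisation step (assumed interpretation)."""
    fmin, fmax = sub_feature_map.min(), sub_feature_map.max()
    return (sub_feature_map - fmin) / (fmax - fmin + 1e-8) * 255.0

def channel_importance(feature_map: torch.Tensor) -> torch.Tensor:
    """feature_map: (C, H, W) output of one feature extraction layer.
    Returns one importance score per feature channel, here the mean
    intensity of the visualised map (an assumed scoring rule)."""
    return torch.stack([to_visual_map(ch).mean() for ch in feature_map])

def select_pruning_channels(importance: torch.Tensor, ratio: float = 0.3):
    """Mark the least important channels as target pruning channels."""
    k = int(importance.numel() * ratio)
    return torch.topk(importance, k, largest=False).indices

# Usage: score the channels of the first conv layer on a test image.
model = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                      nn.Conv2d(16, 32, 3, padding=1), nn.ReLU())
test_image = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    fmap = model[0](test_image)[0]          # (16, H, W) multi-channel feature map
scores = channel_importance(fmap)
prune_idx = select_pruning_channels(scores)
print("target pruning channels:", prune_idx.tolist())
```

In an actual compression step, the filters at these channel indices (and the matching input channels of the next layer) would be removed to obtain the compressed target model.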

Description

Technical Field

[0001] The present invention relates to the technical field of artificial intelligence, and in particular to a model compression method, device, computing equipment, and storage medium.

Background Technique

[0002] In recent years, models based on convolutional neural networks have performed well in many tasks, but they require a large amount of computational overhead and often contain a large amount of redundant information, so model compression becomes an essential step. Commonly used model compression methods include model pruning, quantization, and distillation.

[0003] For the current model pruning operation, the main problem is how to select relatively unimportant feature channels and remove them. Existing methods for selecting relatively unimportant convolution kernels are generally as follows: one is to use the proportion of zero values in the feature map, and then treat the feature channel corresponding to that feature map...
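The first existing criterion mentioned in paragraph [0003], ranking feature channels by the proportion of zero values in their feature maps, can be sketched as follows. This is a generic, assumed formulation written in PyTorch (in the spirit of the zero-proportion/APoZ idea), not the specific implementation the background text refers to.

```python
import torch

def zero_proportion(feature_map: torch.Tensor) -> torch.Tensor:
    """feature_map: (N, C, H, W) post-ReLU activations over a batch of test images.
    Returns, per feature channel, the fraction of zero values; channels with a
    high zero proportion are treated as relatively unimportant."""
    zeros = (feature_map == 0).float()
    return zeros.mean(dim=(0, 2, 3))  # average over batch and spatial dimensions

# Example: channels whose activations are zero more than 90% of the time
# would be candidates for removal under this criterion.
acts = torch.relu(torch.randn(8, 16, 56, 56))
apoz = zero_proportion(acts)
candidates = (apoz > 0.9).nonzero(as_tuple=True)[0]
```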


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06T5/20; G06N3/04
CPC: G06T5/20; G06N3/045
Inventors: 王晓锐, 郑强, 高鹏
Owner: PING AN TECH (SHENZHEN) CO LTD