
Neural network compression and acceleration method, storage equipment and terminal

A neural-network technology applied in the computer field. It addresses the problems that existing methods cannot effectively reduce the computing resources and storage space of a neural network and that computing costs are high, and it achieves the effects of saving computing resources, reducing computing costs and saving training time.

Active Publication Date: 2018-04-20
广州方硅信息技术有限公司
Cites: 12 · Cited by: 40

AI Technical Summary

Problems solved by technology

[0005] Aiming at the shortcomings of existing methods, the present invention proposes a neural network compression and acceleration method, a storage device and a terminal, to solve the prior-art problems that the computing resources and storage space of a neural network cannot be effectively reduced and that computing costs are high, thereby reducing the computing resources and storage space required by the neural network and, in turn, the computing cost.




Embodiment Construction

[0053] Embodiments of the present invention are described in detail below; examples of these embodiments are shown in the drawings, in which the same or similar reference numerals denote the same or similar elements, or elements having the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary, are intended only to explain the present invention, and should not be construed as limiting it.

[0054] Those skilled in the art will understand that, unless otherwise stated, the singular forms "a", "an", "said" and "the" used herein may also include plural forms. It should be further understood that the word "comprising" used in the description of the present invention refers to the presence of said features, integers, steps, operations, elements and/or components, but does not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components and/or groups thereof.

[0055] Those skil...



Abstract

The invention provides a neural network compression and acceleration method, storage equipment and a terminal. The method comprises the following steps: the original neural network is pruned; clustering quantization is carried out on the network weights of the pruned network, and the quantized network is trained to obtain a target neural network; a sparse matrix is used to store the target neural network; an input feature map is converted into an input matrix; and the sparse matrix is multiplied by the input matrix to obtain the output feature map corresponding to the input feature map. The computing resources and storage space of the neural network are thereby reduced, and the computing cost is reduced in turn.
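As a rough illustration of the pipeline summarized above, the sketch below strings together magnitude pruning, k-means weight sharing (one common form of clustering quantization), sparse storage of the weights, an im2col-style conversion of the input feature map into an input matrix, and a sparse matrix product. It is a minimal NumPy/SciPy/scikit-learn sketch under assumed parameters; the function names, the pruning threshold, the cluster count and the omission of the retraining step are illustrative choices, not details taken from the patent.

```python
# Illustrative sketch of the abstract's pipeline; thresholds, cluster counts and
# function names are assumptions, not the patent's actual procedure.
import numpy as np
from scipy import sparse
from sklearn.cluster import KMeans


def prune(weights, threshold=1e-2):
    """Zero out connections whose magnitude falls below a threshold."""
    pruned = weights.copy()
    pruned[np.abs(pruned) < threshold] = 0.0
    return pruned


def cluster_quantize(weights, n_clusters=16):
    """Cluster the surviving weights and replace each by its cluster centroid
    (k-means weight sharing); retraining of the centroids is omitted here."""
    nz = weights != 0
    values = weights[nz].reshape(-1, 1)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(values)
    quantized = weights.copy()
    quantized[nz] = km.cluster_centers_[km.labels_].ravel()
    return quantized


def im2col(feature_map, kh, kw):
    """Unfold an input feature map (C, H, W) into an input matrix whose columns
    are the kh x kw patches, so convolution becomes a matrix multiplication."""
    c, h, w = feature_map.shape
    out_h, out_w = h - kh + 1, w - kw + 1
    cols = np.empty((c * kh * kw, out_h * out_w))
    for i in range(out_h):
        for j in range(out_w):
            cols[:, i * out_w + j] = feature_map[:, i:i + kh, j:j + kw].ravel()
    return cols, (out_h, out_w)


# Example: one convolutional layer with 8 filters of size 3x3 over 3 channels.
rng = np.random.default_rng(0)
kernels = rng.standard_normal((8, 3 * 3 * 3))        # original dense weights
kernels = cluster_quantize(prune(kernels))           # prune, then cluster-quantize
sparse_weights = sparse.csr_matrix(kernels)          # store as a sparse matrix

x = rng.standard_normal((3, 32, 32))                 # input feature map
cols, (oh, ow) = im2col(x, 3, 3)                     # input feature map -> input matrix
out = sparse_weights @ cols                          # sparse-dense matrix product
output_feature_map = out.reshape(8, oh, ow)          # output feature map
print(output_feature_map.shape)                      # (8, 30, 30)
```

Note that CSR storage and sparse multiplication only pay off when pruning leaves the weight matrix genuinely sparse; with the mild threshold used in this toy example the format is shown for shape only.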

Description

Technical field

[0001] The present invention relates to the field of computer technology, and in particular to a neural network compression and acceleration method, a storage device and a terminal.

Background technique

[0002] With the development of neural network models, deeper and larger models have been applied to increasingly difficult problems such as classification, recognition and detection. For example, the deep learning algorithms now widely used in artificial intelligence rely on deep network structures whose computation and model size are large, so they require considerable computing resources and storage space. In production applications, however, server computing resources are increasingly scarce, speed requirements keep rising, and the demand for porting models to mobile terminals is becoming more urgent. Compression and test accele...


Application Information

IPC(8): G06N3/08, G06T1/20, G06K9/62
CPC: G06N3/082, G06T1/20, G06F18/2136, G06F18/23213, G06F18/24
Inventor: 杨达坤, 曾葆明
Owner: 广州方硅信息技术有限公司