
Neural network forward operation hardware structure based on power weight quantization

A neural network hardware technology, applied in computing, digital data processing apparatus, instruments, etc. It addresses the problems that existing weight-quantization schemes increase the complexity of neural network operations, show no advantage in computational overhead, and remain computationally expensive; the effects are reduced computing-resource overhead, reduced computational cost, and reduced storage requirements.

Inactive Publication Date: 2016-11-02
HUAWEI TECH CO LTD +1


Problems solved by technology

Although this method can greatly reduce storage by shortening the code length, the low-bit codes must be decoded back into high-bit floating-point numbers during calculation, so a floating-point multiplier is still required and the computational overhead remains very high. In addition, decoding the low-bit codes into high-bit floating-point numbers requires an additional decoder to be designed.
[0005] Although current neural network weight quantization methods can effectively reduce the storage required for network weights, they show no considerable advantage in reducing the computational overhead of the neural network; the additionally introduced decoder even increases the computational complexity of the neural network.
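The decoder problem described above can be illustrated with a small sketch of codebook-based weight quantization, the scheme the patent contrasts itself against. The codebook, bit-width, and array sizes here are assumptions for illustration, not taken from the patent: weights are stored as small integer indices, but every multiply first decodes an index back into a float, so floating-point multipliers are still needed.

```python
import numpy as np

# Assumed illustration of codebook-based weight quantization: weights are
# stored as 3-bit indices into a float codebook, so every multiply must
# first "decode" the index back to a floating-point value.
rng = np.random.default_rng(0)
codebook = np.array([-0.5, -0.25, -0.125, 0.0, 0.125, 0.25, 0.5])  # example codebook

weights = rng.standard_normal(8)
# Quantize: each weight -> index of the nearest codebook entry (cheap to store).
indices = np.abs(weights[:, None] - codebook[None, :]).argmin(axis=1)

x = rng.standard_normal(8)
# Decode step: indices -> floats, then a full floating-point dot product.
decoded = codebook[indices]          # this lookup is the "extra decoder"
y = np.dot(decoded, x)               # still requires floating-point multipliers
```

Storage shrinks (3 bits per weight instead of 32), but the arithmetic path is unchanged, which is exactly the overhead the power-of-two scheme below avoids.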

Method used


Examples


Description of Embodiments

[0024] Embodiments of the present invention are described in detail below, examples of which are shown in the drawings, wherein the same or similar reference numerals designate elements that are the same or similar, or that have the same or similar functions, throughout. The embodiments described below with reference to the figures are exemplary, intended only to explain the present invention, and should not be construed as limiting it.

[0025] The hardware structure of the neural network forward operation based on power weight quantization according to the embodiment of the present invention will be described below with reference to the accompanying drawings.

[0026] Figure 1 is a structural block diagram of a neural network forward operation hardware structure based on power weight quantization according to an embodiment of the present invention. Figure 2 is a circuit structure diagram of a neural network forward operation hardware structure based on power wei...



Abstract

The invention provides a neural network forward operation hardware structure based on power weight quantization, comprising: an input buffer module for caching the input data and neural network power weights transferred from global memory to the chip, where the neural network power weights comprise convolution kernels and fully connected matrices; a computing module for performing convolution and fully connected matrix-vector multiplication based on power weight quantization; an output buffer module for caching the convolution and fully connected matrix-vector multiplication results produced by the computing module; and a control module for data transfer and scheduling of computing resources among the input buffer module, the computing module, and the output buffer module. This hardware structure can effectively reduce both the storage footprint and the computational overhead of a neural network, significantly broadening the applicability of neural network computing systems on terminal devices.
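The computational saving behind the abstract comes from the algebra of power-of-two weights: if every weight is ±2^e, then w·x is a sign flip plus a bit shift of a fixed-point x, so the multiply-accumulate units in the computing module need no multipliers. The following sketch shows the core idea in software; the function names, exponent range, and fixed-point format are assumptions for illustration, not taken from the patent.

```python
import numpy as np

def quantize_pow2(w, e_min=-4, e_max=0):
    """Round each weight to the nearest signed power of two (sign, exponent)."""
    sign = np.sign(w)
    e = np.clip(np.round(np.log2(np.abs(w) + 1e-12)), e_min, e_max).astype(int)
    return sign, e

def shift_dot(x_fixed, sign, e, frac_bits=8):
    """Dot product with power-of-two weights on fixed-point inputs.

    x_fixed holds integers representing x * 2**frac_bits. Since
    w * x == sign * (x shifted by e), each multiply becomes a shift."""
    acc = 0
    for xi, s, ei in zip(x_fixed, sign, e):
        term = xi << ei if ei >= 0 else xi >> -ei  # shift replaces multiply
        acc += int(s) * term
    return acc / (1 << frac_bits)

# Small demo with weights that are already exact powers of two.
w = np.array([0.5, 0.25, -1.0])
sign, e = quantize_pow2(w)                               # e == [-1, -2, 0]
x_fixed = (np.array([1.0, 2.0, -0.5]) * 256).astype(int) # 8 fractional bits
result = shift_dot(x_fixed, sign, e)                     # exact here: 1.5
```

The same replacement of multipliers by shifters applies term by term inside the convolution and fully connected matrix-vector operations that the computing module performs.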

Description

Technical field [0001] The invention relates to the technical fields of computers and electronic information, and in particular to a neural network forward computing hardware structure based on power weight quantization. Background technique [0002] With the continuous growth of deep neural networks in scale (storage on the order of hundreds of MB) and computation (on the order of 10 GFLOPs for a single forward pass), existing smart terminal devices (such as mobile phones) can no longer support large-scale neural networks. Therefore, effectively reducing the storage and computation of deep neural network applications becomes very important. By means of weight quantization, either converting the weights into fixed-point numbers or establishing a quantized weight codebook, the model can be effectively compressed and the storage required for the neural network reduced. Therefore, how to design an effective quantization method, and then design an efficient hardware stru...
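The storage saving that the background section describes can be made concrete with a small encoding sketch: a 32-bit float weight is replaced by a code holding one sign bit and a 3-bit exponent, i.e. 4 bits per weight, an 8x reduction. The bit layout and exponent range here are assumed examples, not the patent's actual encoding.

```python
import numpy as np

def encode_pow2(w, e_min=-7):
    """Pack w ~= sign * 2**e into 4 bits: bit 3 = sign, bits 0-2 = e - e_min.

    Example layout only; 4 bits replace a 32-bit float (8x less storage)."""
    sign_bit = 1 if w < 0 else 0
    e = int(np.clip(np.round(np.log2(abs(w) + 1e-12)), e_min, e_min + 7))
    return (sign_bit << 3) | (e - e_min)

def decode_pow2(code, e_min=-7):
    """Recover the signed power-of-two value from a 4-bit code."""
    sign = -1.0 if code & 0b1000 else 1.0
    return sign * 2.0 ** ((code & 0b0111) + e_min)
```

For instance, `encode_pow2(-0.25)` yields the 4-bit code 13 (sign bit set, exponent field 5 for e = -2), and `decode_pow2(13)` recovers -0.25 exactly. Unlike a learned codebook, the "decoder" here is trivial, since the code directly names a shift amount.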

Claims


Application Information

IPC (8): G06F7/523
CPC: G06F7/523
Inventor: 汪玉 (Yu Wang), 唐天琪 (Tianqi Tang), 费旭东 (Xudong Fei), 杨华中 (Huazhong Yang)
Owner: HUAWEI TECH CO LTD