
Neural network computing module, method and communication device

A neural network computing module technology, applied in the field of neural network computing modules, which addresses problems such as frequent off-chip memory reads, inefficient data reuse, and huge data volume, and achieves the effects of optimized computing speed and reduced cache requirements.

Active Publication Date: 2021-12-28
绍兴埃瓦科技有限公司

AI Technical Summary

Problems solved by technology

The traditional convolution acceleration device uses the img2col method to expand the input feature map data and convolution kernel data into matrix form according to the kernel size and stride parameters, and then operates on the expanded matrices so that convolution can be accelerated using the matrix multiplication rule. However, once the feature data are expanded into a matrix, this method requires a larger on-chip cache; it likewise requires more frequent reads from off-chip main memory and cannot efficiently reuse the data it has already read, which occupies the read/write bandwidth of the off-chip memory and increases hardware power consumption. At the same time, a convolution acceleration scheme based on img2col expansion is not well suited to implementing, in hardware logic circuits, convolutions with different kernel sizes and strides. During a convolutional network operation, each input channel must perform convolution matrix operations with multiple convolution kernels, so the feature map data must be fetched repeatedly; and if all the feature map data of every channel are cached in the buffer, the data volume is huge. Moreover, because the feature data after matrix conversion are far larger than the original feature data, on-chip storage resources are wasted and large-scale operations become impossible.
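To make the cost of the img2col approach concrete, the sketch below is an illustration written for this summary, not the patent's own code; the function name img2col and the chosen shapes are assumptions. It expands a small single-channel feature map and performs the convolution as a single matrix product. With a 3x3 kernel at stride 1, each interior pixel is copied up to nine times into the expanded matrix, which is why the expanded data demand far more on-chip cache and off-chip bandwidth than the original feature map.

```python
import numpy as np

def img2col(feature_map, k, stride=1):
    """Expand an H x W single-channel feature map into a matrix whose rows
    are the flattened k x k patches visited by the sliding window
    (hypothetical helper for illustration, not the patent's circuit)."""
    h, w = feature_map.shape
    out_h = (h - k) // stride + 1
    out_w = (w - k) // stride + 1
    cols = np.empty((out_h * out_w, k * k), dtype=feature_map.dtype)
    row = 0
    for i in range(0, h - k + 1, stride):
        for j in range(0, w - k + 1, stride):
            cols[row] = feature_map[i:i + k, j:j + k].ravel()
            row += 1
    return cols

fmap = np.arange(64, dtype=np.float32).reshape(8, 8)   # 8x8 input: 64 values
kernel = np.ones((3, 3), dtype=np.float32)

cols = img2col(fmap, k=3)             # shape (36, 9): 324 values to buffer
conv = (cols @ kernel.ravel()).reshape(6, 6)   # convolution as matrix-vector product

print(fmap.size, "->", cols.size)     # 64 -> 324: ~5x more data than the original
```

Even in this toy case the buffered data grow roughly fivefold (64 values become 324), and the blow-up increases with kernel size; avoiding this expansion by streaming feature rows directly is the motivation for the row-wise scheme described in the abstract below.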

Embodiment Construction

[0026] Embodiments of the present application will be described in detail below in conjunction with the accompanying drawings.

[0027] Embodiments of the present application are described below through specific examples, and those skilled in the art can readily understand other advantages and effects of the present application from the content disclosed in this specification. Obviously, the described embodiments are only some, not all, of the embodiments of this application. The present application can also be implemented or applied through other, different specific embodiments, and various modifications or changes can be made to the details in this specification, based on different viewpoints and applications, without departing from the spirit of the present application. It should be noted that, where there is no conflict, the following embodiments and the features in the embodiments can be combined with each other. Based on the embodiments in this application, all...


Abstract

The present invention provides a neural network computing module, method and communication device, belonging to the field of data processing. The module comprises a data controller, a data extractor, a first shift register group and a neural network computing unit. The data controller adjusts the data path according to configuration information and instruction information, and controls the data extractor to extract feature row data and convolution kernel row data, row by row, from the feature map data of the image to be processed. The first shift register group outputs the feature row data to the neural network computing unit in a serial-input, parallel-output manner. The neural network computing unit performs multiply-accumulate operations on the input feature row data and convolution kernel row data to complete the convolution of a convolution kernel with the feature map data, and accumulates multiple convolution results in at least one cycle, thereby realizing circuit reconfiguration and data reuse.
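The following is a minimal behavioural sketch of the dataflow the abstract describes, under simplifying assumptions: one channel, one kernel row of width k, stride 1, and software objects (ShiftRegister, mac_row) that are hypothetical stand-ins for the hardware blocks rather than the patent's circuit. Feature row data are shifted in serially, one value per cycle; once the register is full, its parallel outputs are multiply-accumulated against the convolution kernel row.

```python
from collections import deque

class ShiftRegister:
    """Serial-in, parallel-out shift register with k stages (illustrative model)."""
    def __init__(self, k):
        self.stages = deque([0.0] * k, maxlen=k)

    def shift_in(self, value):
        self.stages.appendleft(value)   # serial input: one value per cycle

    def parallel_out(self):
        return list(self.stages)        # parallel output: all k stages at once, newest first

def mac_row(feature_row, kernel_row):
    """Stream one feature row through the register and multiply-accumulate it
    against one kernel row, producing one row of partial convolution sums."""
    k = len(kernel_row)
    reg = ShiftRegister(k)
    partial_sums = []
    for cycle, pixel in enumerate(feature_row):
        reg.shift_in(pixel)
        if cycle >= k - 1:              # register full: a valid window is present
            window = reg.parallel_out()
            partial_sums.append(sum(w * x for w, x in zip(kernel_row, window)))
    return partial_sums

# One feature row, read once, streamed against a 3-wide kernel row.
print(mac_row([1, 2, 3, 4, 5, 6], [0.5, 1.0, 0.5]))   # four partial sums
```

In hardware, several such register groups and multiply-accumulate units could share the same streamed feature row, so data read once from memory are reused across kernels and output positions; this is the data reuse and cache reduction the abstract refers to.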

Description

Technical Field

[0001] The invention relates to the field of data processing, and in particular to a neural network computing module, method and communication device.

Background Technique

[0002] A convolutional neural network consists of an input layer, any number of hidden layers as intermediate layers, and an output layer. The input layer has multiple input nodes (neurons). The output layer has output nodes (neurons) corresponding to the number of objects to be recognized.

[0003] The convolution kernel is a small window set in the hidden layer which holds the weight parameters. The convolution kernel slides over the input image in sequence according to the stride, and performs multiply-add operations with the input feature image of the corresponding area; that is, each weight parameter in the convolution kernel is first multiplied by the corresponding input image value, and the products are then summed. The traditional...
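As a concrete reading of paragraph [0003], the sketch below implements the sliding-window multiply-add it describes for a single channel with no padding; the name conv2d_direct is chosen here for illustration and is not terminology from the patent.

```python
import numpy as np

def conv2d_direct(feature_map, kernel, stride=1):
    """Slide the kernel over the feature map; at each position, multiply the
    weights by the overlapped pixels and sum the products (no padding)."""
    h, w = feature_map.shape
    k = kernel.shape[0]
    out_h = (h - k) // stride + 1
    out_w = (w - k) // stride + 1
    out = np.zeros((out_h, out_w), dtype=np.float32)
    for i in range(out_h):
        for j in range(out_w):
            patch = feature_map[i * stride:i * stride + k,
                                j * stride:j * stride + k]
            out[i, j] = np.sum(patch * kernel)   # multiply, then accumulate
    return out

fmap = np.arange(25, dtype=np.float32).reshape(5, 5)
kernel = np.full((3, 3), 1.0 / 9, dtype=np.float32)   # 3x3 mean filter
print(conv2d_direct(fmap, kernel))                    # 3x3 output map
```

At each output position the kernel weights are multiplied element-wise with the overlapped patch and the products are summed, which is exactly the multiply-then-accumulate step the background describes.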


Application Information

Patent Type & Authority Patents(China)
IPC IPC(8): G06N3/063G06N3/04
CPCG06N3/063G06N3/045
Inventor 王赟张官兴郭蔚黄康莹张铁亮
Owner 绍兴埃瓦科技有限公司