Deep neural network hardware accelerator based on power exponent quantization
A deep neural network hardware accelerator technology, applied to biological neural network models, neural architectures, neural learning methods, and the like. It addresses the problems of complex processor circuitry, high power consumption, and large storage requirements, achieving improved computing speed, high calculation performance, and a reduced amount of computation.
Embodiment Construction
[0023] The technical solution of the invention will be described in detail below in conjunction with the accompanying drawings.
[0024] The hardware structure of the deep neural network accelerator designed by the present invention is shown in Figure 1. Its operation is illustrated with a PE array of size 16*16, a convolution kernel of size 3*3, and a convolution stride of 1: the accelerator caches the input data and the power-exponent-quantized weight data into the input buffer and the weight buffer through the AXI-4 bus. According to the size of the PE array, the AXI-4 bus reads 16 convolution kernels from the DDR and stores them in the weight buffer, so that 16 weight exponent values can be fed to the PE array in parallel. The convolution kernel data stored in the weight buffer is read into the encoding module, where the weight data is encoded in a certain way according to the sign of the quantized weight ...
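[Note] The paragraph above describes the core arithmetic idea behind power-exponent quantization: when weights are restricted to signed powers of two, each multiplication in a convolution window can be replaced by a bit shift plus a sign, which is what allows the PE array to avoid full multipliers. The following Python sketch illustrates that idea only; the (sign, exponent) encoding, function names, exponent range, and example values are illustrative assumptions and are not taken from the patent text.

```python
import numpy as np

def quantize_to_power_of_two(w, exp_min=-7, exp_max=0):
    """Encode a real weight as (sign, exponent), approximating w by sign * 2**exponent.
    The exponent range is an assumption for illustration."""
    if w == 0:
        return 0, 0
    sign = 1 if w > 0 else -1
    exp = int(np.clip(np.round(np.log2(abs(w))), exp_min, exp_max))
    return sign, exp

def shift_mac(activations, encoded_weights):
    """Accumulate sign * (activation shifted by exponent) instead of true multiplies,
    mimicking what a single PE would do for one convolution window."""
    acc = 0
    for a, (sign, exp) in zip(activations, encoded_weights):
        # A non-negative exponent is a left shift; a negative exponent is a
        # right shift (an integer approximation of multiplying by 2**exp).
        shifted = a << exp if exp >= 0 else a >> -exp
        acc += sign * shifted
    return acc

# Example: one 3*3 convolution window processed with shift-based arithmetic.
window  = [3, 1, 4, 1, 5, 9, 2, 6, 5]                      # integer activations
weights = [0.9, -0.4, 0.26, 0.1, -1.1, 0.5, 0.03, -0.2, 0.7]
encoded = [quantize_to_power_of_two(w) for w in weights]
print(shift_mac(window, encoded))
```

In a hardware realization such as the one described, the (sign, exponent) pair would be produced by the encoding module from the weight buffer, and the shift-and-accumulate step would be carried out by each PE, which is why only the exponent values need to be streamed to the 16*16 array in parallel.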