
Hardware structure for realizing forward calculation of convolutional neural network

A technology relating to convolutional neural networks and forward computation, applied in the field of hardware structures for realizing the forward computation of convolutional neural networks. It addresses the problems that existing hardware acceleration cannot exploit the parallel resources on the board and therefore suffers reduced resource utilization and performance, and achieves the effects of reduced on-chip cache requirements, large parallelism, and high processing performance.

Publication Date: 2017-08-18 (Status: Inactive)
智擎信息系统(上海)有限公司


Problems solved by technology

[0005] The embodiments of the present application provide a hardware structure for realizing the forward calculation of a convolutional neural network, which is used to solve the problem in the prior art that hardware acceleration of convolutional neural networks cannot effectively and fully utilize the on-board resources for maximum parallel acceleration, thereby reducing resource utilization and performance.




Detailed Description of the Embodiments

[0025] In order to make the purpose, technical solutions and advantages of the present application clearer, the application is further described in detail below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments in this application without creative effort fall within the scope of protection of this application.

[0026] The embodiments of the present application will be further described in detail below in conjunction with the accompanying drawings.

[0027] As shown in Figure 1, according to one aspect of the present application, a hardware structure for realizing the forward calculation of a convolutional neural network is provided. The hardware structure can be realized using a field programmable gate array (FPGA) chip or an application-specific integrated circuit (ASIC) ...
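The application describes the acceleration as a multi-level pipeline in which different stages work in parallel. The sketch below is a plain C++ behavioral illustration of that layer-per-stage scheduling idea, not FPGA or ASIC hardware; the stage count and all names are assumptions and are not taken from the patent.

```cpp
// Behavioral sketch of a multi-stage (layer-per-stage) pipeline.
// Each slot holds the id of the picture currently occupying that stage,
// so after the fill phase all stages are busy simultaneously.
#include <array>
#include <iostream>
#include <optional>

constexpr int kStages = 3;  // assumed number of pipelined layers for this sketch

int main() {
    std::array<std::optional<int>, kStages> stage;  // picture id per stage
    stage.fill(std::nullopt);

    int next_picture = 0;
    for (int cycle = 0; cycle < 6; ++cycle) {
        // Advance the pipeline: the last stage retires its picture, every other
        // stage hands its picture forward, and a new picture enters stage 0.
        for (int s = kStages - 1; s > 0; --s) stage[s] = stage[s - 1];
        stage[0] = next_picture++;

        std::cout << "cycle " << cycle << ":";
        for (int s = 0; s < kStages; ++s)
            if (stage[s]) std::cout << "  stage" << s << "=pic" << *stage[s];
        std::cout << "\n";
    }
    return 0;
}
```

In a real FPGA or ASIC realization each stage would be a dedicated datapath rather than a loop iteration; the point of the sketch is only that throughput comes from all stages operating on different pictures at the same time.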



Abstract

The present application discloses a hardware structure for realizing the forward calculation of a convolutional neural network. The hardware structure comprises: a data off-chip caching module, used for caching, within the module, the parameter data of each externally input to-be-processed picture, where the data waits to be read by the multi-level pipeline acceleration module; the multi-level pipeline acceleration module, connected to the data off-chip caching module and used for reading parameters from the data off-chip caching module, so as to realize the core calculation of the convolutional neural network; a parameter reading arbitration module, connected to the multi-level pipeline acceleration module and used for processing the multiple parameter reading requests issued by the multi-level pipeline acceleration module, so that the multi-level pipeline acceleration module obtains the required parameters; and a parameter off-chip caching module, connected to the parameter reading arbitration module and used for storing the parameters required for the forward calculation of the convolutional neural network. The present application implements the algorithm with a hardware architecture in a parallel, pipelined manner, thereby achieving higher resource utilization and higher performance.
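As a rough reading aid for the abstract, the sketch below models the four named modules and the flow of a parameter read request in plain C++. All class and method names are hypothetical, the arbitration policy is a simple FIFO chosen for brevity, and nothing here should be read as the actual circuit described by the patent.

```cpp
// Behavioral model of the four modules named in the abstract and of how a
// parameter read request travels from a pipeline stage, through the read
// arbiter, to the off-chip parameter cache. Names and policy are illustrative.
#include <cstddef>
#include <iostream>
#include <queue>
#include <vector>

// Off-chip buffer holding the parameters (weights) needed for forward calculation.
struct ParamOffChipCache {
    std::vector<float> params;
    float read(std::size_t addr) const { return params.at(addr); }
};

// Off-chip buffer holding the to-be-processed picture data awaiting the pipeline.
struct DataOffChipCache {
    std::vector<float> pixels;
};

// One outstanding parameter read request issued by a pipeline stage.
struct ReadRequest { int stage_id; std::size_t addr; };

// Serializes concurrent read requests from the pipeline stages so that each
// stage eventually obtains the parameter it asked for.
struct ParamReadArbiter {
    std::queue<ReadRequest> pending;
    void request(int stage_id, std::size_t addr) { pending.push({stage_id, addr}); }
    // Grant one request per call (FIFO policy, purely for the sketch).
    bool grant(const ParamOffChipCache& mem, int& stage_id, float& value) {
        if (pending.empty()) return false;
        ReadRequest r = pending.front(); pending.pop();
        stage_id = r.stage_id;
        value = mem.read(r.addr);
        return true;
    }
};

int main() {
    ParamOffChipCache params{{0.1f, 0.2f, 0.3f, 0.4f}};
    DataOffChipCache data{{1.0f, 2.0f, 3.0f}};
    ParamReadArbiter arbiter;

    std::cout << "first value waiting in the data off-chip cache: " << data.pixels.front() << "\n";

    // Two pipeline stages request different weights at the same time.
    arbiter.request(/*stage_id=*/0, /*addr=*/1);
    arbiter.request(/*stage_id=*/1, /*addr=*/3);

    int stage = 0; float w = 0.0f;
    while (arbiter.grant(params, stage, w))
        std::cout << "stage " << stage << " received parameter " << w << "\n";
    return 0;
}
```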

Description

Technical Field

[0001] The present application relates to the field of computer and electronic information technology, and in particular to a hardware structure for realizing the forward calculation of a convolutional neural network.

Background Technique

[0002] With the rise of artificial intelligence, deep learning has become a very popular field. It is widely used in computer vision, speech recognition and other big data applications, and has received more and more attention. As a very important algorithm model in deep learning, the convolutional neural network has been widely used in image classification, face recognition, video detection, speech recognition and so on. The convolutional neural network is modeled on the nervous system of the human brain and consists of many layers. The input information is passed from the initial input layer to the next layer through convolution, multiplication and addition operations, activation functions, and so on, and in this manner is passed lay...
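The "convolution, multiplication and addition operations, activation functions" referred to above are the core calculation that the disclosed hardware structure accelerates. As a minimal software-only illustration, assuming a valid convolution with stride 1 and a ReLU activation (the function and parameter names are hypothetical and not taken from the patent), one layer's forward pass can be written as follows:

```cpp
// Minimal sketch of one convolutional layer's forward pass:
// multiply-accumulate over a window of every input channel, then ReLU.
#include <algorithm>
#include <cstddef>
#include <vector>

// in:  [inC][inH][inW]        flattened input feature maps
// w:   [outC][inC][k][k]      flattened convolution kernels
// out: [outC][outH][outW]     flattened output feature maps (valid conv, stride 1)
void conv_layer_forward(const std::vector<float>& in, int inC, int inH, int inW,
                        const std::vector<float>& w, int outC, int k,
                        std::vector<float>& out) {
    const int outH = inH - k + 1;
    const int outW = inW - k + 1;
    out.assign(static_cast<std::size_t>(outC) * outH * outW, 0.0f);
    for (int oc = 0; oc < outC; ++oc)
        for (int oy = 0; oy < outH; ++oy)
            for (int ox = 0; ox < outW; ++ox) {
                float acc = 0.0f;
                // Multiply-accumulate over every input channel and kernel position.
                for (int ic = 0; ic < inC; ++ic)
                    for (int ky = 0; ky < k; ++ky)
                        for (int kx = 0; kx < k; ++kx) {
                            const float x = in[(ic * inH + oy + ky) * inW + (ox + kx)];
                            const float c = w[((oc * inC + ic) * k + ky) * k + kx];
                            acc += x * c;
                        }
                // Activation function (ReLU) applied before the result is passed on.
                out[(oc * outH + oy) * outW + ox] = std::max(acc, 0.0f);
            }
}
```

The loop nest makes the parallelism visible: the multiply-accumulate operations for different output positions and output channels are independent of one another, which is what allows a pipelined hardware structure to process many of them concurrently.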

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06F9/38G06F9/30G06N3/06
CPCG06F9/30007G06F9/3867G06N3/063
Inventor 曹伟黄峰孙亚洲杨贤王伶俐周学功李慧敏范锡添焦黎
Owner 智擎信息系统(上海)有限公司