
Blocked convolution optimization method and device for convolution neural network

A convolutional neural network optimization technology, applied in the directions of neural learning methods, biological neural network models, and neural architectures, which can solve problems such as convolution processing bottlenecks and achieve the effects of improving efficiency, alleviating resource constraints, and reducing delay.

Active Publication Date: 2017-12-05
INST OF AUTOMATION CHINESE ACAD OF SCI

AI Technical Summary

Problems solved by technology

[0009] To solve the above problems in the prior art, namely the bottleneck of convolution processing for neural networks in hardware processing systems, one aspect of the present invention proposes a blocked convolution optimization method for convolutional neural networks, comprising the following steps:



Examples


Embodiment Construction

[0049] Preferred embodiments of the present invention are described below with reference to the accompanying drawings. Those skilled in the art should understand that these embodiments are only used to explain the technical principles of the present invention, and are not intended to limit the protection scope of the present invention.

[0050] The blocked convolution optimization method for a convolutional neural network according to an embodiment of the present invention, as shown in Figure 1, includes:

[0051] Step 1: based on the preset convolutional neural network model, select the convolutional layer to be divided into blocks, and determine the upper limit of the block size for that layer;

[0052] Step 2: according to the input feature map size and the upper limit of the block size obtained in Step 1, determine the number of blocks and the block size of the input feature map of the convolutional layer to be block-convolved;
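Step 2 does not spell out the partitioning rule. One natural reading, sketched below as a hypothetical helper (the function name and the exact formula are assumptions, not the patent's verbatim rule), is to use the fewest blocks whose size stays at or below the limit, then size the blocks as evenly as possible:

```python
import math

def plan_blocks(feature_size: int, block_limit: int) -> tuple[int, int]:
    """Split one spatial dimension of an input feature map into roughly
    equal blocks, none larger than block_limit.

    Returns (number of blocks, block size). Assumed reading of Steps 1-2;
    the patent may use a different rule.
    """
    num_blocks = math.ceil(feature_size / block_limit)
    block_size = math.ceil(feature_size / num_blocks)
    return num_blocks, block_size

# e.g. a 224-wide feature map with a 64-pixel block-size limit
print(plan_blocks(224, 64))  # -> (4, 56)
```

The same calculation would be applied independently to the height and width of the feature map.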

[0053] Step 3: based on the number o...
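The steps above can be sketched end-to-end. The following is a minimal NumPy illustration of the idea as described in the abstract, assuming stride 1, a 3×3 kernel, and "same" zero padding (the patent's exact block-boundary filling rule is not reproduced here): each block of the input feature map is convolved independently with its own zero-filled boundary, so the whole feature map never needs to reside on-chip at once. Note that with zero filling at interior seams, the output matches the original convolution away from block boundaries and differs near them.

```python
import numpy as np

def conv2d(x, k):
    """Plain 'same' 2-D convolution (cross-correlation), stride 1."""
    p = k.shape[0] // 2
    xp = np.pad(x, p)  # zero boundary filling
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

def blocked_conv2d(x, k, block):
    """Blocked convolution: split x into block*block tiles, convolve each
    tile independently with its own zero-filled boundary, and stitch the
    per-tile outputs back together (a sketch of the patent's idea, not
    its verbatim construction)."""
    out = np.zeros_like(x, dtype=float)
    for bi in range(0, x.shape[0], block):
        for bj in range(0, x.shape[1], block):
            tile = x[bi:bi + block, bj:bj + block]
            out[bi:bi + tile.shape[0], bj:bj + tile.shape[1]] = conv2d(tile, k)
    return out

x = np.arange(64, dtype=float).reshape(8, 8)
k = np.ones((3, 3))
full = conv2d(x, k)
tiled = blocked_conv2d(x, k, block=4)
# tiled equals full in block interiors; values near the seams at
# row/column 3-4 differ because each tile sees zeros there.
```

Because each tile is processed independently, tiles can be streamed through a small on-chip buffer in long contiguous bursts, which is the locality benefit the abstract attributes to the method.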



Abstract

The invention relates to the field of deep neural networks and provides a blocked convolution optimization method and device for a convolutional neural network, to solve the bottleneck of convolution operations in a hardware processing system. The method comprises the following steps: a convolutional layer to be blocked is selected, and the upper limit of the block size is determined; according to this upper limit, the number of blocks and the block size of the input feature map are determined; based on the number of blocks, the block size, the convolution kernel size, the input feature map size, and the padding size of the input feature map boundary, the block boundary filling size of each block feature map is calculated; and based on the number of blocks, the block size, and the block boundary filling size, a convolution based on block boundary filling is built to replace the original convolution. This greatly alleviates the resource constraints of running a convolutional neural network on an embedded hardware platform, maximizes the burst length of memory reads and writes, improves throughput, reduces latency, and improves efficiency.

Description

Technical field

[0001] The invention relates to the technical field of deep neural networks, and in particular to a blocked convolution optimization method and device for convolutional neural networks.

Background technique

[0002] Deep learning, as a cutting-edge branch of machine learning, has developed rapidly in theory and application in recent years. Driven by deep learning, traditional fields such as computer vision and speech and language processing are advancing quickly; computers can even surpass humans in recognizing images, videos, speech, and text. A number of emerging industries and applications have arisen in the wave of deep learning development, such as self-driving cars, chatbots, smart monitoring, and smart homes. Intelligent applications can be seen almost everywhere in people's daily lives. Driven by big data and deep learning, the traditional retail, banking, and insurance industries have entered a new era of Internet development.

[0003] Deep convolut...


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/08, G06N3/04, G06N3/063
CPC: G06N3/063, G06N3/082, G06N3/045
Inventors: 程健, 李钢, 赵天理
Owner: INST OF AUTOMATION CHINESE ACAD OF SCI