
Minimizing memory reads and increasing performance by leveraging aligned blob data in a processing unit of a neural network environment

A neural network and memory technology, applied to architectures with multiple processing units, biological neural network models, general-purpose stored-program computers, and the like; it can reduce the number of memory operations, reduce power consumption and memory usage, and improve human-computer interaction.

Active Publication Date: 2019-11-26
MICROSOFT TECH LICENSING LLC
Cites: 8 | Cited by: 0
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

Operationally, the newly aligned data results in a reduced number of memory operations
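
As an illustrative figure (a hypothetical example, not one taken from the patent): on a memory fabric that services reads in 64-byte lines, a 60-byte row of layer data stored at an arbitrary offset can straddle a line boundary and cost two line reads, whereas the same row padded out and placed on a 64-byte boundary is always satisfied by a single read.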


Embodiment Construction

[0024] The techniques described in the following detailed description provide for the virtualization of one or more hardware iterators to be utilized in an exemplary neural network (NN) and/or deep neural network (DNN) environment, where data is physically laid out in memory components in an aligned manner. Aligning the data in memory allows data manipulations that improve overall performance and optimize memory management. It should be understood that the systems and methods described herein are applicable to both NNs and DNNs; thus, any reference to an NN shall also be taken to refer to a DNN, and vice versa.

[0025] In an exemplary implementation, an exemplary DNN environment may include one or more processing blocks (e.g., computer processing units (CPUs)), memory controllers, line buffers, high-bandwidth fabrics (e.g., local or external fabrics, such as a data bus for passing data and/or data elements between a DNN module and the cooperating components of the DNN environment), an operations control...
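
To make the benefit of the aligned layout described above concrete, here is a minimal sketch (a hypothetical illustration, not code from the patent: the 64-byte line size, the 60-byte rows, and the `lines_touched` helper are all assumptions) that counts the memory-line reads needed for packed versus aligned rows:

```python
def lines_touched(offset: int, length: int, line_size: int = 64) -> int:
    """Count the memory lines a read of `length` bytes starting at
    byte `offset` touches, assuming `line_size`-byte memory lines."""
    first_line = offset // line_size
    last_line = (offset + length - 1) // line_size
    return last_line - first_line + 1

ROWS, ROW_BYTES, LINE = 4, 60, 64

# Packed layout: 60-byte rows stored back to back, so most rows
# straddle a 64-byte line boundary and need two line reads.
packed = sum(lines_touched(r * ROW_BYTES, ROW_BYTES, LINE) for r in range(ROWS))

# Aligned layout: each row padded out to a 64-byte stride, so every
# row starts on a line boundary and needs exactly one line read.
aligned = sum(lines_touched(r * LINE, ROW_BYTES, LINE) for r in range(ROWS))

print(packed, aligned)  # 7 vs. 4 line reads for the same four rows
```

The saving compounds over the many iterations an operations controller performs across a full blob of layer data.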



Abstract

The performance of a neural network (NN) and/or deep neural network (DNN) can be limited by the number of operations being performed as well as by the management of data among the various memory components of the NN/DNN. By inserting a selected padding into the input data to align the input data in memory, data reads/writes can be optimized for processing by the NN/DNN, thereby enhancing the overall performance of the NN/DNN. Operatively, an operations controller/iterator can generate one or more instructions that insert the selected padding into the data. The padding can be calculated using various characteristics of the input data and of the NN/DNN, as well as characteristics of the cooperating memory components. Padding on the output data can be utilized to support the data alignment at the memory components and the cooperating processing units of the NN/DNN.
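
As a rough sketch of the padding step the abstract describes (a hypothetical NumPy illustration; the function names, the 64-byte alignment, and the zero-fill choice are assumptions, not details taken from the patent):

```python
import numpy as np

def aligned_stride(row_bytes: int, alignment: int) -> int:
    """Round a row's byte length up to the next alignment boundary."""
    return -(-row_bytes // alignment) * alignment

def pad_blob(blob: np.ndarray, alignment: int = 64) -> np.ndarray:
    """Append zero padding to each row of a 2-D blob so that, in the
    flattened layout, every row begins on an alignment boundary.
    Assumes the alignment is a multiple of the element size."""
    rows, cols = blob.shape
    row_bytes = cols * blob.itemsize
    pad_elems = (aligned_stride(row_bytes, alignment) - row_bytes) // blob.itemsize
    return np.pad(blob, ((0, 0), (0, pad_elems)), mode="constant")

# Four 60-byte rows become four 64-byte rows; reads of the original
# 60 payload bytes per row now never cross a 64-byte line boundary.
blob = np.arange(4 * 60, dtype=np.uint8).reshape(4, 60)
padded = pad_blob(blob)
assert padded.shape == (4, 64) and padded.strides[0] == 64
```

In a real NN/DNN environment the analogous instructions would be generated by the operations controller/iterator, and the selected padding would also account for the processing-unit and memory-component characteristics the abstract mentions.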

Description

Background technique

[0001] When performing one or more processing operations, such as convolutions, on an exemplary layer of a neural network (NN) or deep neural network (DNN), reading data from memory can consume a significant amount of the time and processing resources of the NN/DNN. Typically, a controller component of the NN/DNN is tasked with performing the processing operations required to iterate over large amounts of data in order to apply a specific operation. In performing various operations, including memory reads and writes to the various cooperating memory components of the NN/DNN (e.g., line buffers) and one or more operations on layer data intended to optimize processing, some existing NNs and DNNs spend avoidable processing time (e.g., floating-point/fixed-point operations per second (GFlops/s)) and memory bandwidth (e.g., bytes transferred per second (GBytes/s)).

[0002] Specifically, current practices do not identify key characteristics of the input...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F15/80; H04L45/50
CPC: G06N3/0464; G06N3/063; G06F2209/485; G06F2209/484; G06F13/1673; G06F1/3275; G06F12/0207; G06F13/1689; G06F13/28; G06F9/46; G06F9/5033; H03M7/3059; H03M7/3066; H03M7/6058; G06F9/5077; G06F15/8007; G06F2212/1016; G06F2212/6026; G06F9/467; G06F9/5016; G06N3/049; G06F9/5061; G06F9/4881; Y02D30/50; G06F1/3206; G06F1/3287; G06N3/045; Y02D10/00; G06F9/3858; G06F9/3887; G06F12/0862; G06F3/0631; G06N3/04; H03M7/6011; G06F12/10; G06F3/067; G06F9/3836; G06N3/06; H04L45/04; G06F17/15; H04L67/02; G06N3/08; G06F1/324; G06N3/10; G06F2212/657; H03M7/46; G06F12/08; G06F3/0604; H03M7/6005; H04L45/50; G06F9/30087; H03M7/70; H04L67/1001; G06N3/065; G06N3/0495; H03M7/3088; G06F12/0238
Inventor: G. Petre, C. B. McBride, A. A. Ambardekar, K. D. Cedola, B. Bobrov, L. M. Wall
Owner: MICROSOFT TECH LICENSING LLC