
Training neural network accelerators using mixed precision data formats

Neural network and multi-layer neural network technology, applied in the field of training neural network accelerators using mixed-precision data formats, addressing the problem that using such models can be computationally intensive.

Pending Publication Date: 2021-07-30
MICROSOFT TECH LICENSING LLC

AI Technical Summary

Problems solved by technology

However, using such a model can be computationally intensive, so it may not be possible to perform feature extraction in real time using a general-purpose processor.

Method used




Detailed Description of the Embodiments

[0013] General Considerations

[0014] The present invention is illustrated in the context of representative embodiments, which are not intended to be limiting in any way.

[0015] As used in this application, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. Additionally, the term "includes" means "comprises". Furthermore, the term "coupled" includes mechanical, electrical, magnetic, optical, and other practical means of coupling or linking items together and does not exclude the presence of intervening elements between the coupled items. Also, as used herein, the term "and/or" means any one or combination of items in a phrase.

[0016] The systems, methods, and devices described herein should not be construed as limiting in any way. Rather, the present disclosure relates to all novel and non-obvious features and aspects of the various disclosed embodiments, both individually and in various combinations and subcombinations with one another.



Abstract

Technology related to training a neural network accelerator using mixed precision data formats is disclosed. In one example of the disclosed technology, a neural network accelerator is configured to accelerate a given layer of a multi-layer neural network. An input tensor for the given layer can be converted from a normal-precision floating-point format to a quantized-precision floating-point format, such as a block floating-point format. A tensor operation can be performed using the converted input tensor. A result of the tensor operation can be converted from the block floating-point format back to the normal-precision floating-point format. The converted result can be used to generate an output tensor of the layer of the neural network, where the output tensor is in the normal-precision floating-point format.
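The abstract describes a round trip for each accelerated layer: convert the input tensor from normal-precision floating point to a quantized block floating-point form, run the tensor operation on the quantized values, then convert the result back to normal precision. The patent page contains no code, so the Python sketch below only illustrates that flow under simplifying assumptions: a single shared exponent per tensor (practical block floating-point schemes typically share exponents over smaller bounding boxes of values), and the helper names to_block_fp and from_block_fp are hypothetical rather than taken from the patent.

import numpy as np

def to_block_fp(tensor, mantissa_bits=8):
    # Quantize a float32 tensor into a toy block floating-point form:
    # one shared exponent for the whole tensor plus signed integer mantissas.
    max_abs = float(np.max(np.abs(tensor)))
    shared_exp = int(np.floor(np.log2(max_abs))) if max_abs > 0 else 0
    # Scale so the largest value fits in the signed mantissa range.
    scale = 2.0 ** (shared_exp - (mantissa_bits - 2))
    mantissas = np.clip(np.round(tensor / scale),
                        -(2 ** (mantissa_bits - 1)),
                        2 ** (mantissa_bits - 1) - 1).astype(np.int32)
    return mantissas, scale

def from_block_fp(mantissas, scale):
    # Convert quantized mantissas back to normal-precision (float32) values.
    return (mantissas * scale).astype(np.float32)

# Hypothetical layer: quantize the input tensor and the layer weights,
# perform the tensor operation (a matrix multiply) on quantized values,
# then convert the accumulated result back to normal precision.
x = np.random.randn(4, 8).astype(np.float32)   # input tensor, normal precision
w = np.random.randn(8, 3).astype(np.float32)   # layer weights, normal precision

xq, xs = to_block_fp(x)
wq, ws = to_block_fp(w)
acc = xq.astype(np.int64) @ wq.astype(np.int64)  # stands in for the accelerator op
y = from_block_fp(acc, xs * ws)                  # output tensor, normal precision
print(float(np.max(np.abs(y - x @ w))))          # quantization error vs. pure float32

With 8-bit mantissas the printed error is small relative to the magnitude of x @ w, which is the trade-off the mixed-precision approach relies on: most arithmetic runs in the compact quantized format, while values cross layer boundaries in normal-precision floating point.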

Description

Background technique

[0001] Machine learning (ML) and artificial intelligence (AI) techniques can be used to solve many complex computing problems, such as recognizing images and speech, analyzing and classifying information, and performing various classification tasks. Machine learning is the field of computer science that uses statistical techniques to give computer systems the ability to extract higher-level features from a set of training data. Specifically, features can be extracted by training models such as an artificial neural network (NN) or a deep neural network (DNN). After a model is trained, new data can be applied to the model and classified using the trained model (e.g., higher-level features can be extracted). Machine learning models are typically executed on a general-purpose processor, also known as a central processing unit (CPU). However, using such a model can be computationally intensive, so it may not be possible to perform feature extraction in real time using a general-purpose processor.

Claims


Application Information

Patent Type & Authority Applications(China)
IPC IPC(8): G06N3/063G06N3/04
CPCG06N3/063G06N3/045G06F17/15G06F17/16G06N3/084G06N20/00
Inventor: B. D. Rouhani; T. Na; E. S. Chung; D. Lo; D. C. Burger
Owner: MICROSOFT TECH LICENSING LLC