
Method for realizing neural network model splitting by using multi-core processor and related product

Neural network model and multi-core processor technology, applied in the field of deep learning

Pending Publication Date: 2020-01-14
ANHUI CAMBRICON INFORMATION TECH CO LTD

AI Technical Summary

Problems solved by technology

Traditional data-parallel solutions cannot meet the small-batch, low-latency requirements placed on accelerators in inference scenarios.




Embodiment Construction

[0083] The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.

[0084] It should be understood that the terms "first", "second", and "third" in the claims, specification, and drawings of the present disclosure are used to distinguish different objects rather than to describe a specific sequence. The terms "comprising" and "comprises" used in the specification and claims of this disclosure indicate the presence of the described features, integers, steps, operations, elements, and/or components, but do not exclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or collections thereof.

[0085] It should also be understood that the terminology used in this disclosure description is for the purpose of describing specific embodiments only, and is not intended to limit the present disclosure. As used in this d...



Abstract

The embodiments of the invention disclose a method for splitting a neural network model using a multi-core processor, and a related product. When the neural network model contains operators that can be split, those operators are split and the optimal splitting combination is selected, yielding an optimal splitting result for the whole model. The sub-operators corresponding to this result are then executed in parallel on multiple cores, reducing the resource consumption of the computing device.
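The abstract mentions selecting an "optimal splitting combination" but this excerpt gives no cost model or algorithm. As a purely hypothetical sketch, one simple way to choose a split is to minimize an assumed cost that trades parallel compute time against per-sub-operator overhead; every name and the cost formula below are illustrative assumptions, not the patent's method:

```python
def split_cost(work, parts, overhead_per_part=2.0):
    # Assumed toy model: compute time shrinks as work is divided across
    # parts, while each extra sub-operator adds a fixed launch/merge cost.
    return work / parts + overhead_per_part * parts

def best_split(work, max_cores):
    # Pick the part count (1..max_cores) with the lowest modeled cost.
    return min(range(1, max_cores + 1), key=lambda p: split_cost(work, p))
```

Under this toy model the best part count settles near the point where parallel savings stop outweighing the per-part overhead; a real system would measure or profile these costs rather than assume them.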

Description

Technical field

[0001] The invention relates to the technical field of deep learning, and in particular to a method for splitting a neural network model using a multi-core processor, and related products.

Background

[0002] In recent years, neural network processors have been continuously proposed and, like general-purpose processors, are expanding from single-core to multi-core designs. Such multi-core structures can support data parallelism in the training phase, improving data throughput and speeding up training. In the inference phase, however, deep neural networks place higher demands on end-to-end latency than on throughput, and latency often determines whether an accelerator is usable in a given scenario. Traditional data-parallel solutions cannot meet the small-batch, low-latency requirements placed on accelerators in inference scenarios.

Summary of the invention

[0003] In order to achieve the above purpose, in the first aspect, the embodiment of the p...
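The description is truncated before the concrete splitting mechanism, so the following is only a minimal illustrative sketch of the general idea: splitting one operator (here a dense matrix multiply) along its input-row axis into sub-operators, executing them concurrently, and merging the partial results. The function names and the choice of split axis are assumptions for illustration, not the patent's actual method:

```python
from concurrent.futures import ThreadPoolExecutor

def matmul(rows, w):
    # Naive dense matrix multiply: `rows` is a list of input rows,
    # `w` a 2-D weight matrix stored as a list of lists.
    return [[sum(r[k] * w[k][j] for k in range(len(w)))
             for j in range(len(w[0]))] for r in rows]

def split_matmul(x, w, num_cores):
    # Split the operator's input along the row axis into one
    # sub-operator per chunk, run the chunks in parallel, and
    # merge the partial outputs back in their original order.
    step = max(1, -(-len(x) // num_cores))  # ceiling division
    chunks = [x[i:i + step] for i in range(0, len(x), step)]
    with ThreadPoolExecutor(max_workers=num_cores) as pool:
        parts = pool.map(matmul, chunks, [w] * len(chunks))
    return [row for part in parts for row in part]
```

Because the sub-operators are independent, the merged result equals the unsplit operator's output; a real implementation on a neural network accelerator would also have to account for weight placement, inter-core synchronization, and operators whose outputs cannot be merged by simple concatenation.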

Claims


Application Information

IPC(8): G06N3/063; G06N3/04; G06N3/08
CPC: G06N3/063; G06N3/08; G06N3/045; G06F15/80; G06N3/04
Inventor: Undisclosed (inventor name not published)
Owner ANHUI CAMBRICON INFORMATION TECH CO LTD