
Neural network compiling optimization method and system

A neural network compilation optimization technology, applied to an optimization method and system for neural network compilation and to the compiler and compilation process, which can solve the problems of increased memory burden, increased system burden, and system execution efficiency that is not significantly improved.

Status: Pending | Publication Date: 2021-04-27
BEIJING TSINGMICRO INTELLIGENT TECH CO LTD

AI Technical Summary

Problems solved by technology

However, while this method improves system execution speed, it also increases the memory burden; as a result, it does not significantly improve overall system execution efficiency and additionally increases the system burden.



Embodiment Construction

[0055] In order to make the technical features, objects, and effects of the invention easier to understand, specific embodiments of the present invention are described below with reference to the drawings, in which the same reference numerals denote components that are identical or structurally similar.

[0056] Herein, "schematic" means "serving as an example, instance, or illustration", and any embodiment described herein as "schematic" should not be interpreted as a more preferred or more advantageous technical solution. To keep the drawings simple, only the portions relevant to the exemplary embodiments are shown schematically; they do not represent the actual structure or true proportions of the product.

[0057] In one embodiment of the neural network compilation optimization method of the present invention, the method mainly includes the following parts: IR conversion, an inter-layer fusion scheme search module, a blocking and single-operator scheduling module, a cost m...
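
As a rough illustration of how these parts might fit together, the sketch below wires hypothetical stand-ins for the modules named in [0057] into one pipeline. All class, method, and variable names (CompilerPipeline, FusionPlan, ir_conversion, search_fusion_schemes, schedule_operators, estimate_cost) and the toy logic inside them are assumptions made for illustration, not the patent's actual implementation.

```python
# Illustrative sketch only; names and logic are assumptions, not the patent's implementation.
from dataclasses import dataclass
from typing import List


@dataclass
class FusionPlan:
    groups: List[List[str]]          # operators grouped into candidate fusion layers
    cost: float = float("inf")       # system-overhead value assigned by the cost model


class CompilerPipeline:
    def compile(self, model: List[str]) -> FusionPlan:
        graph = self.ir_conversion(model)                 # model -> computation-graph IR
        plans = self.search_fusion_schemes(graph)         # inter-layer fusion scheme search
        for plan in plans:
            sched = self.schedule_operators(graph, plan)  # blocking + single-operator scheduling
            plan.cost = self.estimate_cost(sched)         # cost model on simulated hardware
        return min(plans, key=lambda p: p.cost)           # keep the cheapest fusion scheme

    # Toy stand-ins for the real modules.
    def ir_conversion(self, model: List[str]) -> List[str]:
        return list(model)                                # pretend the op list is already an IR

    def search_fusion_schemes(self, graph: List[str]) -> List[FusionPlan]:
        return [FusionPlan(groups=[list(graph)]),           # fuse everything into one layer
                FusionPlan(groups=[[op] for op in graph])]  # no fusion at all

    def schedule_operators(self, graph: List[str], plan: FusionPlan) -> List[str]:
        return [op for group in plan.groups for op in group]  # dependency order assumed given

    def estimate_cost(self, schedule: List[str]) -> float:
        return float(len(schedule))                       # placeholder overhead metric


if __name__ == "__main__":
    best = CompilerPipeline().compile(["conv1", "relu1", "conv2"])
    print(best.groups, best.cost)
```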



Abstract

The invention provides a neural network compilation optimization method. The method comprises the steps of: obtaining a computation graph with a set data structure from a deep learning model to be compiled; fusing one or more preprocessing layers into a plurality of fusion layers; obtaining the operator computation order within a fusion layer according to the dependency relationships between the operators in that layer; obtaining the numbers of input and output calls and a splitting strategy for the intra-layer operators; obtaining the corresponding system overhead values of the plurality of fusion layers on a simulated hardware platform; taking the fusion layer with the minimum system overhead value among the plurality of fusion layers as the current fusion layer; and compiling the deep learning model according to the current fusion layer. By fusing multiple layers of operators of the neural network, computed intermediate results are kept on chip instead of being read from and written to memory, which effectively reduces memory-access requirements and improves the execution efficiency of the system. The invention further provides a corresponding optimization system for neural network compilation.
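
The selection step described above, evaluating several candidate fusion schemes on a simulated platform and keeping the one with the smallest overhead, can be illustrated with a toy cost model in which every un-fused layer boundary costs one off-chip write plus one read of its intermediate tensor, while boundaries inside a fusion layer keep the tensor on chip. The layer names, tensor sizes, and cost formula below are illustrative assumptions, not the patent's actual cost model.

```python
# Toy illustration of picking the fusion scheme with minimal simulated overhead.
# All numbers and names are assumptions for the sketch, not the patent's cost model.
from typing import Dict, List

# Intermediate tensor sizes (bytes) produced after each layer of a toy network.
INTERMEDIATE_BYTES: Dict[str, int] = {"conv1": 1 << 20, "relu1": 1 << 20, "conv2": 1 << 19}


def overhead(groups: List[List[str]]) -> int:
    """Off-chip traffic: only tensors crossing a fusion-layer boundary hit memory."""
    traffic = 0
    for group in groups[:-1]:            # the final output leaves the chip in every scheme
        boundary_op = group[-1]
        traffic += 2 * INTERMEDIATE_BYTES[boundary_op]   # one write + one read back
    return traffic


candidate_fusions: List[List[List[str]]] = [
    [["conv1"], ["relu1"], ["conv2"]],   # no fusion: every intermediate goes off chip
    [["conv1", "relu1"], ["conv2"]],     # fuse conv1+relu1: their intermediate stays on chip
    [["conv1", "relu1", "conv2"]],       # fuse everything
]

best = min(candidate_fusions, key=overhead)
print("chosen fusion layers:", best, "overhead:", overhead(best), "bytes")
```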

Description

Technical field
[0001] The present invention relates to the field of reconfigurable processors and their applications, and is applied to the compiler and compilation process of a reconfigurable processor. The present invention specifically relates to an optimization method and system for neural network compilation.
Background technique
[0002] Deep neural networks have been applied in many fields, such as face recognition, machine translation, and recommendation systems. As the complexity of deep neural network models increases, better results are obtained on the corresponding tasks, but the computational complexity also increases. In order to improve the computational efficiency of deep neural networks so that the corresponding tasks can be performed more efficiently, it is necessary to optimize the computation of complex neural networks using neural network compilation optimization techniques.
[0003] In the current computer architecture, the execution speed ...

Claims


Application Information

IPC(8): G06F 8/41; G06N 3/04; G06N 3/063; G06N 3/08
CPC: G06F 8/443; G06N 3/04; G06N 3/063; G06N 3/08
Inventors: 欧道理, 郑时轩, 欧阳鹏
Owner: BEIJING TSINGMICRO INTELLIGENT TECH CO LTD