
Neural network compiling method and system, computer storage medium and compiling equipment

A neural network compilation technology, applied to neural learning methods, biological neural network models, neural architecture, etc., which addresses problems such as detailed algorithms being invisible to users, poor portability, and inflexible optimization algorithms.

Active Publication Date: 2021-03-19
SHANGHAI JIAO TONG UNIV

AI Technical Summary

Problems solved by technology

[0006] In view of the above shortcomings of the prior art, the purpose of the present invention is to provide a neural network compilation method, system, computer storage medium and compilation equipment that address the following problems of the prior art: heavy encapsulation and limited interfaces open to users, which make debugging and parameter tuning inconvenient; optimization processes and detailed algorithms that are invisible to users and cannot support further user-driven optimization; inflexible optimization algorithms; poor optimization for different hardware back ends; and poor portability that requires substantial intervention by human experts.



Examples


Embodiment 1

[0035] The invention provides a method for compiling a neural network, comprising:

[0036] translating network files into intermediate expression files;

[0037] optimizing the intermediate expression file from the perspectives of performance analysis and single-node and multi-node collaboration;

[0038] generating, from the optimized intermediate expression file, a network template file based on the hardware interface;

[0039] compiling the network template file into an executable inference application.

[0040] The method for compiling the neural network provided by this embodiment will be described in detail below with reference to the drawings. The neural network compilation method described in this embodiment provides end-to-end inference services for users: it generates, from existing packaged network files, template files based on the target hardware interface, and then produces executable inference applications. The optimization process can optimize the execution efficiency...
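To make the flow of paragraphs [0035]-[0039] concrete, the sketch below chains the four stages as plain Python functions. It is only an illustration of the described pipeline; the function names, the dictionary-based intermediate expression and the example inputs are assumptions, not the patent's actual implementation.

```python
# A minimal sketch of the four compilation stages in paragraphs [0035]-[0039].
# All function names, the dictionary-based IR and the example file names are
# illustrative assumptions, not the patent's implementation.

def translate(network_file: str) -> dict:
    """Parse a packaged network file into an intermediate expression (IR) file."""
    return {"source": network_file, "nodes": [], "passes": []}

def optimize(ir: dict) -> dict:
    """Apply performance-analysis, single-node and multi-node optimizations."""
    ir["passes"] += ["profiling", "single_node_tuning", "multi_node_coordination"]
    return ir

def generate_template(ir: dict, hardware_interface: str) -> str:
    """Emit a network template file targeting a concrete hardware interface."""
    return f"template({ir['source']}, {hardware_interface}, passes={ir['passes']})"

def compile_application(template: str) -> str:
    """Compile the template file into an executable inference application."""
    return f"inference_app <- {template}"

if __name__ == "__main__":
    ir = optimize(translate("model.onnx"))            # hypothetical network file
    app = compile_application(generate_template(ir, "gpu_runtime"))
    print(app)
```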

Embodiment 2

[0085] This embodiment provides a neural network compilation system, including:

[0086] a translation module, configured to translate network files into intermediate expression files;

[0087] an optimization module, configured to optimize the intermediate expression file from the perspectives of performance analysis and single-node and multi-node collaboration;

[0088] a file generation module, configured to generate a network template file based on a hardware interface from the optimized intermediate expression file;

[0089] a compilation module, configured to compile the network template file into an executable inference application.

[0090] The neural network compilation system provided by this embodiment will be described in detail below with reference to the drawings. Please refer to Figure 3, which is a schematic structural diagram of a neural network compilation system in an embodiment. As shown in Figure 3, the neural network compilation system 3 includes a tran...
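The sketch below shows one way the four modules of paragraphs [0086]-[0089] could be composed into a single compilation system. The class names and the string/dict artifacts are illustrative assumptions, not the structure of compilation system 3 in the figure.

```python
# Hypothetical sketch of the module structure in paragraphs [0086]-[0089]:
# four modules chained by a compilation-system object. Names are illustrative.

class TranslationModule:
    def run(self, network_file: str) -> dict:
        # Translate the network file into an intermediate expression file.
        return {"source": network_file, "nodes": []}

class OptimizationModule:
    def run(self, ir: dict) -> dict:
        # Performance analysis plus single-node and multi-node optimization.
        ir["optimized"] = True
        return ir

class FileGenerationModule:
    def __init__(self, hardware_interface: str):
        self.hardware_interface = hardware_interface

    def run(self, ir: dict) -> str:
        # Generate a network template file bound to the hardware interface.
        return f"template({ir['source']}, {self.hardware_interface})"

class CompilationModule:
    def run(self, template: str) -> str:
        # Compile the template into an executable inference application.
        return f"inference_app <- {template}"

class NeuralNetworkCompilationSystem:
    """Chains the four modules in the order given in Embodiment 2."""
    def __init__(self, hardware_interface: str):
        self.modules = [TranslationModule(), OptimizationModule(),
                        FileGenerationModule(hardware_interface),
                        CompilationModule()]

    def build(self, network_file: str) -> str:
        artifact = network_file
        for module in self.modules:
            artifact = module.run(artifact)
        return artifact

# Example (hypothetical inputs):
# app = NeuralNetworkCompilationSystem("gpu_runtime").build("model.onnx")
```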

Embodiment 3

[0118] This embodiment provides a compiling device, including: a processor, a memory, a transceiver, a communication interface and/or a system bus. The memory and the communication interface are connected to the processor and the transceiver through the system bus and communicate with each other; the memory is used to store a computer program, the communication interface is used to communicate with other devices, and the processor and the transceiver are used to run the computer program, so that the compiling device executes the above steps of the neural network compiling method.

[0119] The system bus mentioned above may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The system bus can be divided into an address bus, a data bus, a control bus, and so on. For ease of representation, only one thick line is used in the figure, but this does not mean that there is only one bus or only one type of bus....
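For illustration only, the following sketch models the compiling device of paragraph [0118] as a small Python object whose memory stores a program that its processor then runs. The class and method names are assumptions, and the bus, transceiver and communication-interface details are deliberately omitted.

```python
# Illustrative sketch of the compiling device in paragraph [0118]: the memory
# stores the compiler program, the processor runs it, and the communication
# interface reaches other devices. All names here are assumptions.

class CompilingDevice:
    def __init__(self):
        self.memory = {}   # stores computer programs by name
        self.peers = []    # devices reachable via the communication interface

    def store_program(self, name, program):
        self.memory[name] = program

    def run_program(self, name, *args):
        # The processor executes the stored program, i.e. the neural network
        # compiling method described above.
        return self.memory[name](*args)

# Example (hypothetical): store the compilation pipeline and run it.
# device = CompilingDevice()
# device.store_program("compile_nn", NeuralNetworkCompilationSystem("gpu_runtime").build)
# app = device.run_program("compile_nn", "model.onnx")
```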



Abstract

The invention provides a neural network compiling method and system, a computer storage medium and compiling equipment. The neural network compiling method comprises the following steps: translating a network file into an intermediate expression file; optimizing the intermediate expression file from the perspectives of performance analysis and single-node and multi-node cooperation; generating a network template file based on a hardware interface from the optimized intermediate expression file; and compiling the network template file into an executable inference application. The invention aims to design and implement a compiling tool chain framework capable of automatically adjusting parameters and generating code according to software and hardware information, an intermediate representation and a corresponding optimization algorithm, so that, when the framework is used for computation on a target chip and the network output is unchanged, a higher computation rate and a smaller computation delay can be obtained in a shorter optimization time. The user can also conveniently debug and adjust parameters on his or her own.

Description

Technical field

[0001] The invention belongs to the technical field of neural networks and relates to a compiling method, in particular to a compiling method, system, computer storage medium and compiling equipment for a neural network.

Background technique

[0002] Today, the development of neural networks has greatly promoted machine learning, artificial intelligence and related industries, such as face recognition, speech recognition, online translation and automatic driving. However, because of the huge network structure and computational load of neural networks, large latency is the main obstacle to their large-scale industrial deployment. Therefore, how to reduce operation latency and improve the computation speed of neural networks is an important issue in their development.

[0003] When compiling, most of the existing neural network compilation and optimization tools receive network files provided by users an...

Claims


Application Information

IPC(8): G06N3/08, G06N3/04
CPC: G06N3/08, G06N3/045
Inventors: 刘子汉, 冷静文, 陆冠东, 陈全, 李超, 过敏意
Owner: SHANGHAI JIAO TONG UNIV