
Neural network compiling method, system, computer storage medium and compiling device

A technology for compiling neural networks, applied to neural learning methods, biological neural network models, and neural architectures. It addresses problems of the prior art such as limited interfaces, a large optimization space, and a high degree of encapsulation, and achieves convenient debugging, a high computation rate, and low computation latency.

Active Publication Date: 2022-03-18
SHANGHAI JIAOTONG UNIV


Problems solved by technology

[0006] In view of the shortcomings of the prior art described above, the purpose of the present invention is to provide a neural network compilation method, system, computer storage medium, and compilation device. These address the following problems of the prior art: a high degree of encapsulation and limited interfaces open to users, which make debugging and parameter tuning inconvenient; an optimization process and algorithms that are invisible to users and cannot support further user-driven optimization; inflexible optimization algorithms; poor optimization across different hardware targets; and poor portability that requires extensive intervention by human experts.



Examples


Embodiment 1

[0035] The present invention provides a method for compiling a neural network, including:

[0036] translating a network file into an intermediate representation file;

[0037] optimizing the intermediate representation file at the performance-analysis, single-node, and multi-node levels;

[0038] generating a network template file based on a hardware interface from the optimized intermediate representation file;

[0039] compiling the network template file into an executable inference application.
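The four steps above can be sketched as a simple pipeline. This is a minimal illustration, not the patent's actual implementation: all function names and data shapes are assumptions made for the example.

```python
# Hypothetical sketch of the four-step compilation flow described above.
# Function names and return formats are illustrative assumptions.

def translate_to_ir(network_file: str) -> dict:
    """Step 1: parse the network file into an intermediate representation (IR)."""
    return {"source": network_file, "nodes": ["conv", "relu", "fc"]}

def optimize_ir(ir: dict) -> dict:
    """Step 2: apply performance-analysis, single-node, and multi-node passes."""
    ir["passes_applied"] = ["profiling", "single_node", "multi_node"]
    return ir

def generate_template(ir: dict, hardware_interface: str) -> str:
    """Step 3: emit a network template file targeting the given hardware interface."""
    return f"template({hardware_interface}, nodes={len(ir['nodes'])})"

def build_inference_app(template: str) -> str:
    """Step 4: compile the template into an executable inference application."""
    return f"app[{template}]"

def compile_network(network_file: str, hardware_interface: str) -> str:
    ir = translate_to_ir(network_file)
    ir = optimize_ir(ir)
    template = generate_template(ir, hardware_interface)
    return build_inference_app(template)

print(compile_network("resnet50.onnx", "cuda"))  # → app[template(cuda, nodes=3)]
```

The key design point is that each stage consumes the previous stage's output only, so an optimization pass can be inspected or replaced without touching translation or code generation.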

[0040] The neural network compilation method provided in this embodiment is described with reference to the figures. The method provides users with an end-to-end inference service: starting from an existing, packaged network file, it generates a template file based on the target hardware interface and compiles it into an executable inference application. The optimization process improves the execution efficiency of the generated code.

[0041] See fi...

Embodiment 2

[0085] This embodiment provides a neural network compilation system, including:

[0086] a translation module, used to translate a network file into an intermediate representation file;

[0087] an optimization module, used to optimize the intermediate representation file at the performance-analysis, single-node, and multi-node levels;

[0088] a file generation module, used to generate a network template file based on a hardware interface from the optimized intermediate representation file;

[0089] a compilation module, used to compile the network template file into an executable inference application.
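The modular decomposition above can be sketched as a system that composes four independent components. All class names here are assumptions for illustration; the patent does not publish an API.

```python
# Hypothetical sketch of the modular compilation system in this embodiment.
# Each module is a separate component, so individual stages can be debugged
# or swapped independently. Class names are illustrative assumptions.

class TranslationModule:
    def run(self, network_file: str) -> dict:
        return {"ir_of": network_file}

class OptimizationModule:
    def run(self, ir: dict) -> dict:
        return {**ir, "optimized": True}

class FileGenerationModule:
    def run(self, ir: dict, hardware_interface: str) -> str:
        return f"template:{hardware_interface}"

class CompilationModule:
    def run(self, template: str) -> str:
        return f"inference_app({template})"

class CompilationSystem:
    """Wires the four modules together, mirroring the translation and
    optimization modules (31, 32, ...) named in the embodiment."""

    def __init__(self):
        self.translator = TranslationModule()
        self.optimizer = OptimizationModule()
        self.generator = FileGenerationModule()
        self.compiler = CompilationModule()

    def compile(self, network_file: str, hardware_interface: str) -> str:
        ir = self.translator.run(network_file)
        ir = self.optimizer.run(ir)
        template = self.generator.run(ir, hardware_interface)
        return self.compiler.run(template)
```

A usage example: `CompilationSystem().compile("net.pb", "fpga")` runs all four modules in sequence and returns the compiled inference application.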

[0090] The neural network compilation system provided in this embodiment is described with reference to the figures. Please refer to figure 3, a schematic diagram of the neural network compilation system in an embodiment. As shown in figure 3, the neural network compilation system 3 includes a translation module 31, an optimization module 32, a fil...

Embodiment 3

[0118] This embodiment provides a compilation device, including: a processor, a memory, a transceiver, a communication interface, and/or a system bus. The memory and the communication interface are connected to the processor and the transceiver through the system bus; the memory stores a computer program; the communication interface is used to communicate with other devices; and the processor and the transceiver run the computer program, enabling the compilation device to perform the steps of the neural network compilation method.

[0119] The above-mentioned system bus may be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus. The system bus can be divided into an address bus, a data bus, a control bus, and so on. For ease of presentation, only one thick line is shown in the figure, but this does not mean there is only one bus or one type of bus. The communication interface is used to implement communication between the database access device and other devices (such as clients, read-write libraries, a...


Abstract

The present invention provides a neural network compiling method, system, computer storage medium, and compiling device. The neural network compiling method includes: translating a network file into an intermediate representation file; optimizing the intermediate representation file; generating a network template file from the optimized intermediate representation file based on a hardware interface; and compiling the network template file into an executable inference application. The present invention aims to design and implement a compiler tool-chain framework, an intermediate representation, and corresponding optimization algorithms that can automatically tune parameters and generate code according to software and hardware information, so that computation on the target chip does not change the network's output results, while achieving a higher computation rate and lower computation latency within a short optimization time and making it convenient for users to debug and tune parameters themselves.

Description

Technical field

[0001] The present invention belongs to the field of neural networks and relates to compilation methods, in particular to a neural network compilation method, system, computer storage medium, and compilation device.

Background technique

[0002] Today, the development of neural networks has greatly promoted machine learning, artificial intelligence, and related industries, such as face recognition, speech recognition, online translation, and autonomous driving. However, because neural networks have large network structures and heavy computational loads, latency is the main obstacle to their large-scale deployment in industrial production. Therefore, how to reduce latency and improve the computation speed of neural networks is an important issue in their development.

[0003] Existing neural network compilation and optimization tools take network files provided by users and directly generate executa...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06N3/08, G06N3/04
CPC: G06N3/08, G06N3/045
Inventors: 刘子汉, 冷静文, 陆冠东, 陈全, 李超, 过敏意
Owner: SHANGHAI JIAOTONG UNIV