
Data processing method and device, AI chip, electronic equipment and storage medium

A data processing and chip technique in the field of neural networks. It addresses problems such as dedicated chips not supporting sparse-network compression and decompression operations, compression schemes being unsuited to the current network scenario, and the resulting gains being negligible; the achieved effects are improved efficiency and good hardware performance.

Active Publication Date: 2022-07-08
成都登临科技有限公司 +1

AI Technical Summary

Problems solved by technology

However, this approach faces two problems: the dedicated chip may not support compression and decompression of sparse networks at all; and even if it does, the compression and decompression scheme it uses may not suit the current network scenario, so the gains are negligible.



Examples


Embodiments

[0068] In an optional implementation, compressing the relevant data of the neural network with the optimal compression algorithm may proceed as follows: divide the data into blocks according to the format the hardware requires; align each block according to the hardware's alignment requirement; then compress each aligned block with the optimal compression algorithm to obtain the corresponding compressed data and data index, while ensuring the compressed output still meets the hardware's alignment requirement.
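The block-and-align flow of [0068] can be sketched as below. This is only an illustration: the patent does not specify which compression algorithm is chosen, so a simple non-zeros-plus-bitmask scheme stands in for it, and the block size and alignment values are invented.

```python
def compress_blocks(data, block_size=8, align=4):
    # Sketch of [0068]: split the network data into hardware-format
    # blocks, pad each block to the required alignment, then compress
    # each block into (values, index). "Compression" here keeps only
    # the non-zero elements; the data index is a bitmask recording
    # where they sat in the original block.
    compressed = []
    for i in range(0, len(data), block_size):
        block = list(data[i:i + block_size])
        block += [0] * ((-len(block)) % align)       # align the raw block
        mask = [x != 0 for x in block]               # data index
        values = [x for x in block if x != 0]        # compressed data
        values += [0] * ((-len(values)) % align)     # keep output aligned too
        compressed.append((values, mask))
    return compressed

def decompress_block(values, mask):
    # Restore a block from its compressed values and data index.
    out, it = [], iter(values)
    for bit in mask:
        out.append(next(it) if bit else 0)
    return out
```

Keeping both the input blocks and the compressed output aligned matches the claim in [0068] that the compressed data must still conform to the hardware's alignment requirements.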

[0069] The relevant data of the neural network (uncompressed) may be large, e.g. 100 MB, while the computing unit can complete only 1 MB of task computation at a time. The hardware must then load data into the computing unit 100 times. The data loaded each ...
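The arithmetic behind [0069] is ceiling division over tiles. The sketch below uses illustrative sizes and a hypothetical helper name; the 4x compression ratio is likewise only an assumed figure to show why compressing the data reduces the number of loads.

```python
def num_loads(total_bytes, tile_bytes):
    # The compute unit processes one fixed-size tile per load, so the
    # load count is the total size divided by the tile size, rounded up.
    return -(-total_bytes // tile_bytes)

MB = 1 << 20
print(num_loads(100 * MB, 1 * MB))       # 100 loads for the uncompressed data
print(num_loads(100 * MB // 4, 1 * MB))  # 25 loads if compression shrinks it 4x
```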



Abstract

The invention relates to a data processing method and device, an AI chip, electronic equipment and a storage medium, and belongs to the technical field of neural networks. The data processing method comprises the following steps: acquiring basic information of a neural network deployed on an AI chip and basic information of the AI chip; selecting an optimal compression algorithm from a plurality of preset compression algorithms according to the basic information of the neural network and of the AI chip; and compressing the relevant data of the neural network with the optimal compression algorithm to obtain compressed data and corresponding data indexes, where a data index is used either to restore the compressed data to the original data before compression or to determine the positions of the non-zero elements of the compressed data within that original data. By taking the specific conditions of the neural network and the AI chip into account, a compression scheme suited to the current scenario is flexibly selected, the sparse neural network is accelerated in a near-optimal way, and both performance and energy efficiency are improved.
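The selection step described in the abstract can be illustrated with a minimal sketch. The function name, preset fields, and scoring rule below are all hypothetical, since this excerpt does not state how the candidate algorithms are actually ranked against the network and chip information.

```python
def select_algorithm(net_info, chip_info, presets):
    # Keep only presets the chip supports and whose sparsity threshold
    # the network meets; the real criteria (layer shapes, hardware
    # decompressor features, etc.) are not given in this excerpt.
    candidates = [
        p for p in presets
        if p["needs_sparsity"] <= net_info["sparsity"]
        and p["name"] in chip_info["supported"]
    ]
    if not candidates:
        return "none"   # fall back to no compression
    # Prefer the candidate promising the highest compression ratio.
    return max(candidates, key=lambda p: p["ratio"])["name"]

# Illustrative preset table (names and numbers are made up).
presets = [
    {"name": "bitmask",    "needs_sparsity": 0.3, "ratio": 2.0},
    {"name": "run_length", "needs_sparsity": 0.7, "ratio": 4.0},
]
```

For example, a network with 80% sparsity on a chip supporting both schemes would select `run_length`, while one with 50% sparsity would fall back to `bitmask`.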

Description

Technical field [0001] The present application belongs to the technical field of neural networks, and specifically relates to a data processing method and apparatus, an AI chip, an electronic device and a storage medium. Background technique [0002] With the advent of the artificial intelligence (AI) era, intelligent tasks such as image recognition, speech recognition, and natural language processing are ubiquitous in daily life. Neural networks, among the most effective algorithms for such tasks, have received extensive attention and application. However, a large neural network has many layers and nodes, and therefore a large number of weight parameters, a time-consuming training process, and a large storage footprint for the trained model. Hence, in the field of artificial intelligence, sparse neural networks have drawn increasing attention, and many optimization methods have been proposed to obtain greater benefits com...

Claims


Application Information

Patent Type & Authority: Application (China)
IPC (8): G06N3/063, G06N3/04, G06N3/08, G06F16/174
CPC: G06N3/063, G06N3/08, G06F16/1744, G06N3/045, Y02D10/00
Inventor: 段茗
Owner: 成都登临科技有限公司