
Neural network computing method and device, mobile terminal and storage medium

A neural network algorithm and neural network technology, applied to biological neural network models, physical realization, etc., addressing the problem of long neural network inference time.

Active Publication Date: 2019-06-18
GUANGDONG OPPO MOBILE TELECOMM CORP LTD

AI Technical Summary

Problems solved by technology

As the neural network becomes more complex and the number of operators increases, the inference time of the neural network also becomes longer.




Embodiment Construction

[0021] In order to enable those skilled in the art to better understand the solution of the present invention, the technical solutions in the embodiments of the present application will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without creative effort fall within the protection scope of the present invention.

[0022] The terms "first", "second", and the like in the description, claims, and drawings of the present invention are used to distinguish different objects rather than to describe a specific order. Furthermore, the terms "include" and "have", as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a pr...



Abstract

The embodiments of the invention disclose a neural network calculation method and device, a mobile terminal, and a storage medium. The method comprises: obtaining M to-be-executed operators and calculating the dependency relationships among the M to-be-executed operators, where M is an integer greater than or equal to 2; partitioning the M to-be-executed operators according to their dependency relationships to obtain N operator sets, where each of the N operator sets comprises at least one operator and N is an integer greater than or equal to 2; and, if the N operator sets are mutually independent, starting N threads to compute the operators in the N operator sets respectively. According to the embodiments of the invention, the inference time of the neural network can be reduced.
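The steps in the abstract (partition the operators by their dependency relationships, then run mutually independent operator sets on separate threads) can be sketched in Python. This is a minimal illustration under assumptions, not the patent's implementation: dependencies are modeled as a dict mapping each operator to its prerequisites, independent operator sets are recovered as connected components via union-find, and each set is given its own thread. All function names here are hypothetical.

```python
import threading
from collections import defaultdict

def partition_independent_sets(ops, deps):
    """Split operators into mutually independent sets.

    ops:  list of operator identifiers
    deps: dict mapping an operator to the set of operators it depends on
    Two operators linked by any dependency end up in the same set, so
    the returned sets are independent of each other (a union-find pass).
    """
    parent = {op: op for op in ops}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for op, prereqs in deps.items():
        for d in prereqs:
            union(op, d)

    groups = defaultdict(list)
    for op in ops:
        groups[find(op)].append(op)
    return list(groups.values())

def run_in_threads(op_sets, execute_set):
    """Start one thread per operator set, as in the abstract's final step."""
    threads = [threading.Thread(target=execute_set, args=(s,)) for s in op_sets]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
```

For example, with operators a..d where b depends on a and d depends on c, the partition yields two independent sets, {a, b} and {c, d}, which can then be computed on two threads.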

Description

Technical field

[0001] The present application relates to the field of communication technology, and in particular to a neural network calculation method, device, mobile terminal, and storage medium.

Background technique

[0002] In current neural network algorithm frameworks (for example, TensorFlow Lite), when performing neural network calculations, all operators to be executed are added to a queue, and the processor then calls and executes these operators in turn; that is, the operators are executed sequentially in a single thread. As neural networks become more complex and the number of operators increases, the inference time of the neural network also becomes longer.

Contents of the invention

[0003] Embodiments of the present application provide a neural network calculation method, device, mobile terminal, and storage medium, which can reduce the inference time of the neural network.

[0004] In the first aspect, the embodiment o...
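Within each operator set, operators may still depend on one another, so the thread handling a set must execute its operators in dependency order rather than queue order. A hedged sketch of that per-set scheduling, assuming a topological sort over the set's internal dependencies (the function name and data layout are illustrative, not taken from the application):

```python
from collections import deque

def topo_order(ops, deps):
    """Return the operators of one set in an executable dependency order.

    ops:  operators belonging to a single operator set
    deps: dict mapping an operator to the set of operators it depends on
    Kahn's algorithm: repeatedly emit operators whose prerequisites
    (within this set) have all been emitted.
    """
    members = set(ops)
    indeg = {op: len(deps.get(op, set()) & members) for op in ops}
    children = {op: [] for op in ops}
    for op in ops:
        for d in deps.get(op, set()):
            if d in children:
                children[d].append(op)

    ready = deque(op for op in ops if indeg[op] == 0)
    order = []
    while ready:
        op = ready.popleft()
        order.append(op)
        for c in children[op]:
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)

    if len(order) != len(ops):
        raise ValueError("cyclic dependency among operators")
    return order
```

A thread for one operator set would walk this order and invoke each operator's kernel in turn; sets with no mutual dependencies can do so concurrently.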

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/063
CPC: G06N3/063; Y02D10/00
Inventors: 刘耀勇, 陈岩
Owner: GUANGDONG OPPO MOBILE TELECOMM CORP LTD