Neural network processing method and device, storage medium, and electronic apparatus

A neural network processing technology, applied in the fields of neural network processing methods and devices, storage media, and electronic devices. It addresses the problems of slow convergence, low precision, and the absence of an effective existing solution, with the effects of reducing errors and solving the slow-convergence problem.

Inactive Publication Date: 2019-01-11
ENNEW DIGITAL TECH CO LTD

AI Technical Summary

Problems solved by technology

[0003] Figure 1 is a schematic diagram of the Gaussian weight distribution of a prior-art convolutional neural network. As shown in Figure 1, the weights w_l of each layer of the convolutional neural network are generally distributed around 0, close to a Gaussian distribution. Therefore, if a binarization operation is performed directly on the weights, the binarized weights will differ greatly from the original weights w_l, causing oscillation when the stochastic gradient descent algorithm is used to optimize the convolutional neural network, which in turn leads to slower convergence and lower accuracy.
In the same way, the output activation values of each layer of the convolutional neural network also follow an approximately Gaussian distribution; if binary quantization is forcibly applied, the values before and after quantization will differ greatly.
[0004] For the above problems in the prior art, no effective solution has yet been found.
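The quantization gap described above is easy to demonstrate numerically. The sketch below (an illustration, not taken from the patent; the Gaussian parameters and the XNOR-Net-style mean-absolute-value scaling are assumptions) draws roughly Gaussian weights, binarizes them naively with the sign function, and compares the resulting error with a scaled binarization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer weights w_l are roughly Gaussian around 0, as Figure 1 describes.
w = rng.normal(loc=0.0, scale=0.1, size=10_000)

# Naive binarization maps every weight directly to +1 or -1.
w_bin = np.sign(w)
w_bin[w_bin == 0] = 1.0  # treat exact zeros as +1

# The gap between binarized and original weights is large relative to the
# weights themselves, which is what causes oscillation under SGD.
err_naive = np.mean((w - w_bin) ** 2)

# Scaling the sign by the mean absolute weight (an XNOR-Net-style choice,
# assumed here for illustration) shrinks that gap by orders of magnitude.
alpha = np.mean(np.abs(w))
err_scaled = np.mean((w - alpha * w_bin) ** 2)

print(err_naive, err_scaled)  # err_naive near 0.85, err_scaled near 0.004
```

This is why some form of constraint or scaling on the weights, rather than direct binarization, is needed before quantization.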



Examples

Experimental program
Comparison scheme
Effect test

Embodiment 1

[0030] The method embodiment provided in Embodiment 1 of the present application may be executed in a server, a network terminal, a computer terminal, or a similar computing device. Taking execution on a network terminal as an example, Figure 2 is a hardware structural block diagram of a network terminal implementing the neural network training method in an embodiment of the present invention. As shown in Figure 2, the network terminal 10 may include one or more processors 102 (only one is shown in Figure 2; the processor 102 may include, but is not limited to, a processing device such as a microprocessor (MCU) or a programmable logic device (FPGA)) and a memory 104 for storing data. Optionally, the above-mentioned network terminal may also include a transmission device 106 for communication functions as well as input and output devices 108. Those of ordinary skill in the art can understand that the structure shown in Figure 2 is only for illustration, and does not limit...

Embodiment 2

[0060] This embodiment also provides a neural network training device, which is used to implement the above embodiments and preferred implementations; what has already been described will not be repeated. As used below, the term "module" may be a combination of software and/or hardware that realizes a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementations in hardware, or in a combination of software and hardware, are also possible and contemplated.

[0061] Figure 4 is a structural block diagram of the neural network training device according to an embodiment of the present invention. As shown in Figure 4, the device includes:

[0062] The determination module 40 is used to determine the objective function in the convolutional neural network, wherein each layer of the objective function corresponds to an initial network weight, wherein the convolutional neural network is ...
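The modular structure of the device can be sketched in code. The sketch below is hypothetical: the module names follow the description, but the processing rule inside `ProcessingModule` (sign binarization scaled by a coefficient times the mean absolute weight) is an assumption, since this excerpt does not state the exact regular constraint:

```python
import numpy as np

class DeterminationModule:
    """Determines the objective function's initial per-layer network weights."""
    def determine(self, layer_shapes, rng):
        # Initial weights drawn from a Gaussian around 0 (assumed).
        return [rng.normal(0.0, 0.1, size=s) for s in layer_shapes]

class ProcessingModule:
    """Processes initial weights using a preset regular-constraint coefficient."""
    def __init__(self, coeff=1.0):
        self.coeff = coeff

    def process(self, weights):
        # Assumed rule: scale the sign of each weight matrix by the
        # coefficient times its mean absolute value.
        return [self.coeff * np.mean(np.abs(w)) * np.sign(w) for w in weights]

rng = np.random.default_rng(0)
initial = DeterminationModule().determine([(4, 4), (4, 2)], rng)
target = ProcessingModule(coeff=1.0).process(initial)
print([t.shape for t in target])  # [(4, 4), (4, 2)]
```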

Embodiment 3

[0071] An embodiment of the present invention also provides a storage medium, in which a computer program is stored, wherein the computer program is set to execute the steps in any one of the above method embodiments when running.

[0072] Optionally, in this embodiment, the above-mentioned storage medium may be configured to store a computer program for performing the following steps:

[0073] S1, determining the objective function in the convolutional neural network, wherein each layer of the objective function corresponds to an initial network weight, wherein the convolutional neural network is applied to at least one of the following: face recognition, vehicle detection, object identification;

[0074] S2, process the initial network weights according to the preset regular constraint coefficients to obtain the target network weights;

[0075] S3, using the target network weights and input values in the objective function to obtain an output value.
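Steps S1 to S3 can be sketched end to end. The excerpt does not give the exact form of the regular constraint, so the scaling rule in `process_weights` below (sign binarization scaled by the coefficient times the mean absolute weight) and the linear forward pass are assumptions for illustration:

```python
import numpy as np

def process_weights(w_init, reg_coeff):
    """S2: derive target weights from initial weights using a preset
    regular-constraint coefficient (assumed scaling rule)."""
    alpha = reg_coeff * np.mean(np.abs(w_init))
    return alpha * np.sign(w_init)

def forward(w_target, x):
    """S3: obtain an output value from the target weights and an input."""
    return x @ w_target

rng = np.random.default_rng(0)

# S1: initial network weight for one layer of the objective function.
w_init = rng.normal(0.0, 0.1, size=(8, 4))

w_target = process_weights(w_init, reg_coeff=1.0)   # S2
y = forward(w_target, rng.normal(size=(2, 8)))      # S3
print(y.shape)  # (2, 4)
```

Note that every entry of `w_target` has the same magnitude, so the layer can be stored as a single scalar plus a sign matrix.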

[0076] Optionally, ...



Abstract

The invention provides a neural network processing method and device, a storage medium, and an electronic apparatus, wherein the method comprises: determining an objective function in a convolutional neural network, wherein each layer of the objective function corresponds to an initial network weight, and the convolutional neural network is applied to at least one of face recognition, vehicle detection, and object recognition; processing the initial network weights according to preset regular-constraint coefficients to obtain target network weights; and using the target network weights and input values in the objective function to obtain an output value. The invention solves the technical problems of slow convergence and low precision of convolutional neural networks in the prior art.

Description

technical field

[0001] The present invention relates to the field of communications, and in particular to a neural network processing method and device, a storage medium, and an electronic device.

Background technique

[0002] In prior-art convolutional neural networks, the complexity of the network grows as the number of layers deepens. For example, the number of convolutional layers of the popular ResNet neural network can exceed 1000. In addition, the computation of all convolutional layers accounts for almost 80% of the computation of the entire network. As a result, such convolutional neural networks cannot be run on embedded devices such as surveillance cameras. To reduce the computational complexity of the convolutional layers, existing techniques directly binarize the floating-point weights and floating-point activations of the neural network.

[0003] Figure 1 is a schematic diagra...
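The computational payoff of binarization mentioned in the background is that a dot product over {-1, +1} values reduces to a bitwise XNOR followed by a popcount, replacing floating-point multiply-accumulates. A minimal sketch on plain Python integers (the bit packing and encoding convention are illustrative, not from the patent):

```python
def pack(vec):
    """Pack a {-1, +1} vector into an integer: bit i = 1 means vec[i] = +1."""
    bits = 0
    for i, v in enumerate(vec):
        if v > 0:
            bits |= 1 << i
    return bits

def binary_dot(a_bits, b_bits, n):
    """Dot product of two packed {-1, +1} vectors of length n.
    XNOR marks positions where the signs match; each match contributes +1
    and each mismatch -1, so the result is 2 * matches - n."""
    xnor = ~(a_bits ^ b_bits) & ((1 << n) - 1)
    matches = bin(xnor).count("1")
    return 2 * matches - n

a = [1, -1, 1, 1]
b = [1, 1, -1, 1]
print(binary_dot(pack(a), pack(b), len(a)))  # 0, same as the plain dot product
```

On real hardware the XNOR and popcount operate on whole machine words at once, which is where the large speedup over floating-point convolution comes from.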

Claims


Application Information

Patent Type & Authority: Applications (China)
IPC(8): G06N3/04
CPC: G06N3/045
Inventor: 陈江林
Owner: ENNEW DIGITAL TECH CO LTD