
Information processing device, information processing method, and information processing program

An information processing device and program technology in the field of neural networks, addressing the problem that a neural network cannot operate in real time on resource-limited devices such as embedded devices.

Pending Publication Date: 2021-09-10
MITSUBISHI ELECTRIC CORP

AI Technical Summary

Problems solved by technology

Therefore, when a neural network is installed directly on a device with limited resources, such as an embedded device, it cannot operate in real time.



Examples


Embodiment 1

[0033] (Summary)

[0034] This embodiment describes how to reduce the weight of a neural network when it is installed on a device with limited resources, such as an embedded device.

[0035] More specifically, in this embodiment, the layer with the largest amount of computation is extracted from among the layers of the neural network. The computation load of the extracted layer is then reduced so that the required processing performance is met. In addition, relearning is performed after the amount of computation has been reduced, thereby suppressing a drop in the recognition rate.

[0036] By repeatedly executing the above steps, according to the present embodiment, it is possible to obtain a neural network with a small amount of computation that can be installed in devices with limited resources.
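The loop described in paragraphs [0035]-[0036] (extract the heaviest layer, reduce its load, relearn, repeat) can be sketched as follows. This is an illustrative outline, not the patent's implementation: the layer names, operation counts, pruning ratio, and the `lighten` helper are all invented for the example, and the relearning step is left as a comment.

```python
# Sketch of the iterative lightweighting loop: shrink the heaviest layer
# until the total computation fits the required budget.
# All numbers and names below are hypothetical.

def estimate_ops(layers):
    """Total (e.g. multiply-accumulate) operations across all layers."""
    return sum(layer["ops"] for layer in layers)

def lighten(layers, required_ops, prune_ratio=0.5):
    """Repeatedly reduce the largest layer's load until the budget is met."""
    layers = [dict(layer) for layer in layers]  # work on a copy
    while estimate_ops(layers) > required_ops:
        heaviest = max(layers, key=lambda l: l["ops"])   # extract largest layer
        heaviest["ops"] = int(heaviest["ops"] * prune_ratio)  # reduce its load
        # ... relearning (retraining) would happen here to recover accuracy ...
    return layers

net = [{"name": "conv1", "ops": 900},
       {"name": "conv2", "ops": 400},
       {"name": "fc",    "ops": 100}]
slim = lighten(net, required_ops=800)
print(estimate_ops(slim) <= 800)  # True
```

Note that the layer to reduce is re-selected on every iteration, matching the "repeatedly executing the above steps" phrasing: after the first reduction a different layer may become the heaviest.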

[0037] (Steps)

[0038] Next, the procedure for reducing the weight of the neural network according to the present embodiment will be described with reference to t...



Abstract

A processing performance calculation unit (101) calculates the processing performance of an embedded device when a neural network having multiple layers is implemented on it. A request fulfillment determination unit (102) determines whether that processing performance satisfies a requested processing performance. When the request fulfillment determination unit (102) determines that the requested processing performance is not satisfied, a reduction layer indicating unit (103) indicates, on the basis of the processing amount of each layer of the neural network, a reduction layer, that is, a layer whose processing amount is to be reduced, from among the multiple layers.
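The three units in the abstract can be read as a small pipeline: (101) estimate device performance, (102) check it against the request, (103) pick the layer to reduce. The following is a hedged sketch of that flow; the performance model (total operations divided by device throughput), the threshold, and the layer data are invented for illustration and do not come from the patent.

```python
# Illustrative pipeline for units (101)-(103). All figures are hypothetical.

def processing_performance(layers, device_ops_per_sec):
    """(101) Estimated inference time of the network on the embedded device."""
    return sum(layers.values()) / device_ops_per_sec

def satisfies_request(perf_sec, required_sec):
    """(102) Does the estimated performance meet the requested performance?"""
    return perf_sec <= required_sec

def indicate_reduction_layer(layers):
    """(103) Indicate the layer with the largest processing amount."""
    return max(layers, key=layers.get)

layers = {"conv1": 9e8, "conv2": 4e8, "fc": 1e8}   # ops per inference
perf = processing_performance(layers, device_ops_per_sec=1e9)  # 1.4 s
if not satisfies_request(perf, required_sec=1.0):
    print(indicate_reduction_layer(layers))  # conv1
```

In this toy run the estimated 1.4 s misses the 1.0 s request, so unit (103) would indicate `conv1` as the reduction layer.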

Description

Technical Field

[0001] The present invention relates to neural networks.

Background Technique

[0002] A neural network (hereinafter simply referred to as a network) requires large-scale computation. Therefore, when a neural network is installed directly on a device with limited resources, such as an embedded device, it cannot operate in real time. To operate a neural network in real time on a device with limited resources, the weight of the neural network must be reduced.

[0003] Patent Document 1 discloses a structure for increasing the inference processing speed of a neural network.

[0004] Patent Document 1 discloses a configuration that reduces the number of product-sum calculations in inference processing by reducing the dimension of the weight matrix. More specifically, Patent Document 1 discloses a structure in which, in order to minimize the reduction in recognition accuracy caused by reducing the amount...
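Paragraph [0004]'s idea, reducing the dimension of the weight matrix to cut the number of product-sum calculations, is commonly realized by a low-rank factorization. The sketch below (not Patent Document 1's actual method; matrix sizes and rank are arbitrary) shows why it helps: replacing an m-by-n weight matrix with rank-r factors turns one m*n-cost product into two products costing r*(m+n) in total.

```python
# Low-rank factorization of a weight matrix via truncated SVD.
# Sizes m, n and rank r are illustrative, not from the patent.
import numpy as np

m, n, r = 256, 512, 32
rng = np.random.default_rng(0)
W = rng.standard_normal((m, n))

U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :r] * s[:r]         # m x r factor (singular values folded in)
B = Vt[:r, :]                # r x n factor

full_cost = m * n            # multiply-adds for y = W @ x
low_rank_cost = r * (m + n)  # multiply-adds for y = A @ (B @ x)

x = rng.standard_normal(n)
y_approx = A @ (B @ x)       # two cheap products replace one big one
print(low_rank_cost < full_cost)  # True
```

With these sizes the count drops from 131072 to 24576 multiply-adds; the accuracy lost to truncation is what the relearning step in Embodiment 1 is meant to recover.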

Claims


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08, G06N3/10
CPC: G06N3/082, G06N3/04
Inventor: 冈田尚也
Owner: MITSUBISHI ELECTRIC CORP