Data training method and device, storage medium and electronic device

A data training and model training technology, applied in the field of artificial intelligence, which addresses problems such as implementations that are difficult to achieve, huge computing power consumption, and increased investment in network resources, and achieves the effect of improving training speed and resolving low training efficiency.

Pending Publication Date: 2019-07-09
ZTE CORP

AI Technical Summary

Problems solved by technology

[0002] In the related art, training deep learning models requires a huge amount of computing power, and a single training run often takes days or even months to complete.
Therefore, to speed up the training of deep learning models, two approaches are commonly used: adding processing equipment, and optimizing the training model. The first increases the investment in network resources, and the second is difficult to achieve in a short time.
[0003] No effective solution to the above problems in the related art has yet been proposed.



Examples


Embodiment 1

[0021] This embodiment provides a data training method. Figure 1 is a flowchart of a data training method according to an embodiment of the present invention. As shown in Figure 1, the process includes the following steps:

[0022] Step S102, determining sample data and available cluster resources;

[0023] Step S104, splitting the total training model into multiple sub-models;

[0024] Step S106, using cluster resources to train sample data in parallel on multiple sub-models.

[0025] Through the above steps, the total training model is split into multiple sub-models and the sample data is trained in parallel on those sub-models. This solves the technical problem of low efficiency in training sample data in the related art and improves the speed of training sample data without increasing network resources.
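To make the flow concrete, the following is a minimal sketch of steps S102 to S106 in PyTorch. The toy model, the layer sizes, the even splitting rule, and the single-host multi-GPU "cluster" are all illustrative assumptions, not details disclosed by the patent; for clarity the sub-models also run one after another per batch, whereas a real system would overlap them, for example with micro-batch pipelining.

    # Sketch of steps S102-S106; all names and sizes are illustrative.
    import torch
    import torch.nn as nn

    # Step S102: determine sample data and available cluster resources.
    samples = torch.randn(64, 128)               # toy sample data
    labels = torch.randn(64, 10)
    n_devices = max(torch.cuda.device_count(), 1)

    # Step S104: split the total training model into multiple sub-models.
    total_model = nn.Sequential(
        nn.Linear(128, 256), nn.ReLU(),
        nn.Linear(256, 64), nn.ReLU(),
        nn.Linear(64, 10),
    )
    layers = list(total_model)
    chunk = (len(layers) + n_devices - 1) // n_devices
    sub_models = [nn.Sequential(*layers[i:i + chunk])
                  for i in range(0, len(layers), chunk)]

    # Step S106: place each sub-model on its own device and train.
    devices = [torch.device(f"cuda:{i}") if torch.cuda.is_available()
               else torch.device("cpu") for i in range(len(sub_models))]
    for m, d in zip(sub_models, devices):
        m.to(d)

    params = [p for m in sub_models for p in m.parameters()]
    opt = torch.optim.SGD(params, lr=0.01)
    for _ in range(3):                           # a few toy iterations
        x = samples
        for m, d in zip(sub_models, devices):    # forward across devices
            x = m(x.to(d))
        loss = nn.functional.mse_loss(x, labels.to(devices[-1]))
        opt.zero_grad()
        loss.backward()                          # autograd crosses devices
        opt.step()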

[0026] Optionally, the execution subject of the above steps may be a server, a data processing system, a cluster platform, or the like, but is not limited thereto.

Embodiment 2

[0042] This embodiment also provides a data training device, which is used to implement the above embodiments and preferred implementations; what has already been explained is not repeated here. As used below, the term "module" may be a combination of software and/or hardware that realizes a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementations in hardware, or in a combination of software and hardware, are also possible and contemplated.

[0043] Figure 2 is a structural block diagram of a data training device according to an embodiment of the present invention. As shown in Figure 2, the device includes:

[0044] Determining module 20, configured to determine sample data and available cluster resources;

[0045] Splitting module 22, configured to split the total training model into multiple sub-models;

[0046] Training module 24, configured to use the cluster resources to train the sample data in parallel on the multiple sub-models.
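The three modules can be pictured as plain classes wired together. The sketch below is a structural illustration only: the module names follow the text above, but every method body is an assumption of ours, not the patent's API.

    # Structural sketch of the device in Figure 2; bodies are illustrative.
    from typing import Any, List, Sequence, Tuple

    class DeterminingModule:                       # module 20
        def determine(self, samples: Sequence,
                      resources: List[str]) -> Tuple[Sequence, List[str]]:
            # A real system would query the cluster scheduler here.
            return samples, resources

    class SplitModule:                             # module 22
        def split(self, layers: List[Any], n_parts: int) -> List[List[Any]]:
            # Evenly partition the total model's layers into sub-models.
            chunk = (len(layers) + n_parts - 1) // n_parts
            return [layers[i:i + chunk] for i in range(0, len(layers), chunk)]

    class TrainingModule:                          # module 24
        def train(self, sub_models: List[List[Any]], samples: Sequence,
                  resources: List[str]) -> None:
            # Dispatch each sub-model to a resource; the loop is elided.
            for sub, dev in zip(sub_models, resources):
                print(f"training sub-model {sub} on {dev}")

    class DataTrainingDevice:
        # Wires the three modules together, mirroring steps S102-S106.
        def __init__(self) -> None:
            self.determining = DeterminingModule()
            self.splitting = SplitModule()
            self.training = TrainingModule()

        def run(self, samples: Sequence, total_model: List[Any],
                resources: List[str]) -> None:
            samples, resources = self.determining.determine(samples, resources)
            sub_models = self.splitting.split(total_model, len(resources))
            self.training.train(sub_models, samples, resources)

    DataTrainingDevice().run([1, 2, 3],
                             ["layer0", "layer1", "layer2", "layer3"],
                             ["gpu:0", "gpu:1"])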

Embodiment 3

[0050] This embodiment is an optional embodiment of the application, used to explain and illustrate the application in detail with specific model examples:

[0051] To speed up the training of a deep learning model, parallel computing can be used: a training run is divided into multiple sub-parts, and each sub-part is computed at the same time on a different computing device, thereby speeding up training. In deep learning parallel computing there are two types of parallel algorithms, data parallelism and model parallelism, and an appropriate algorithm must be selected according to the characteristics of the model and of the computing cluster.
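The difference between the two schemes can be shown with a toy two-layer network. In the NumPy sketch below the "devices" are simulated sequentially, purely to show how the work is divided; both schemes compute exactly the same function as the unsplit model.

    # Toy contrast of data parallelism vs. model parallelism (simulated).
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(8, 4))       # a batch of 8 samples
    W1 = rng.normal(size=(4, 16))     # layer 1 weights
    W2 = rng.normal(size=(16, 2))     # layer 2 weights

    # Data parallelism: each device holds the FULL model, a SLICE of the batch.
    x0, x1 = X[:4], X[4:]
    y0 = np.maximum(x0 @ W1, 0) @ W2  # "device 0": samples 0-3
    y1 = np.maximum(x1 @ W1, 0) @ W2  # "device 1": samples 4-7
    y_data_parallel = np.vstack([y0, y1])

    # Model parallelism: each device holds a SLICE of the model, the full batch.
    h = np.maximum(X @ W1, 0)         # "device 0": layer 1 only
    y_model_parallel = h @ W2         # "device 1": layer 2 only

    # Both match the unsplit model.
    y_ref = np.maximum(X @ W1, 0) @ W2
    assert np.allclose(y_data_parallel, y_ref)
    assert np.allclose(y_model_parallel, y_ref)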

[0052] This embodiment provides such a method and system: it can select an appropriate parallel algorithm according to the characteristics of the deep learning model and the characteristics of the high-performance cluster, and automatically transform the total training model accordingly for parallel training.
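The text above does not disclose the concrete selection rule, so the following is only one plausible heuristic, stated as our assumption: prefer data parallelism when a full model replica fits in a single device's memory, and fall back to model parallelism when it does not.

    # Hypothetical selection heuristic; NOT the rule disclosed by the patent.
    def choose_parallel_algorithm(model_bytes: int,
                                  device_mem_bytes: int,
                                  n_devices: int) -> str:
        if n_devices < 2:
            return "single-device"
        if model_bytes <= device_mem_bytes:
            # Every device can hold a full replica, so shard the batch.
            return "data-parallel"
        # The model is too large for one device, so shard the model itself.
        return "model-parallel"

    # e.g. a 20 GiB model on 16 GiB devices forces model parallelism:
    print(choose_parallel_algorithm(20 << 30, 16 << 30, 4))  # model-parallel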



Abstract

The invention provides a data training method and device, a storage medium and an electronic device. The method comprises the steps of: determining sample data and available cluster resources; splitting the total training model into a plurality of sub-models; and training the sample data in parallel on the plurality of sub-models by using the cluster resources. The invention solves the technical problem of low efficiency in training sample data in the prior art.

Description

Technical field

[0001] The present invention relates to the field of artificial intelligence, and in particular to a data training method and device, a storage medium, and an electronic device.

Background technique

[0002] In the related art, training deep learning models requires huge computing power, and a single training run often takes days or even months to complete. Therefore, to speed up the training of deep learning models, two approaches are commonly used: adding processing equipment, and optimizing the training model. The first increases the investment in network resources, and the second is difficult to achieve in a short time.

[0003] No effective solution to the above problems in the related art has yet been proposed.

Contents of the invention

[0004] Embodiments of the present invention provide a data training method and device, a storage medium, and an electronic device.

[0005] According to an...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N3/08, G06N3/04
CPC: G06N3/08, G06N3/045, G06N3/10
Inventor: 韩炳涛
Owner: ZTE CORP