
Data distributed incremental learning method, system and device and storage medium

A technology combining incremental learning and data distribution, applied in machine learning and computing models, which addresses the inability of existing schemes to handle complex distributed scenarios and achieves improved learning ability and strong practicality.

Pending Publication Date: 2021-09-28
XI AN JIAOTONG UNIV
0 Cites · Cited by 2

AI Technical Summary

Problems solved by technology

However, none of the existing machine learning schemes can handle such complex scenarios, which poses a significant challenge to realizing effective learning.



Examples


Embodiment 1

[0076] Referring to Figure 1 and Figure 2, the data distributed incremental learning method of the present invention comprises the following steps (a minimal sketch of the per-node training appears after the step list):

[0077] 1) Determine the number of data distribution nodes and the number of incremental learning stages;

[0078] 2) Establish a training data set;

[0079] 3) Determine the categories of each incremental learning stage and divide the training data set into T independent data sets, with each incremental learning stage corresponding to one data set; then, in the current incremental learning stage, establish the data set of each data distribution node according to the data set corresponding to that stage;

[0080] 4) Input the global shared model parameters of the previous incremental learning stage, together with the data set of each data distribution node in the current incremental learning stage, to each data distribution node, and then perform incremental learning training under the constraint of the incremental learning loss function, ...
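The excerpt truncates step 4), but steps 1) through 4) describe per-node incremental training that starts from the previous stage's global shared model. Below is a minimal sketch of how such a step might look, assuming PyTorch and a distillation-style incremental learning loss; the function name train_node, the loader local_loader, and the hyperparameters alpha and T are illustrative assumptions rather than the patent's actual formulation.

```python
# Hypothetical sketch of step 4: local incremental training at one data
# distribution node, initialized from the previous stage's global shared model.
# All identifiers and hyperparameters are illustrative assumptions.
import copy
import torch
import torch.nn.functional as F

def train_node(global_model, local_loader, epochs=1, lr=0.01, alpha=0.5, T=2.0):
    """Train one node's model starting from the previous-stage global parameters."""
    old_model = copy.deepcopy(global_model).eval()    # frozen previous-stage model
    node_model = copy.deepcopy(global_model).train()  # node model to be updated
    opt = torch.optim.SGD(node_model.parameters(), lr=lr)

    for _ in range(epochs):
        for x, y in local_loader:
            logits = node_model(x)
            ce = F.cross_entropy(logits, y)           # learn the new-stage data
            with torch.no_grad():
                old_logits = old_model(x)
            # distillation term: one common form of an incremental learning loss
            kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                          F.softmax(old_logits / T, dim=1),
                          reduction="batchmean") * T * T
            loss = ce + alpha * kd                    # constrained incremental loss
            opt.zero_grad()
            loss.backward()
            opt.step()
    return node_model.state_dict()
```

Each node would then return its updated parameters to the aggregation described in the Abstract below.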

Embodiment 2

[0130] Embodiment 2: A data distributed incremental learning system, comprising:

[0131] A determination module is used to determine the number of data distribution nodes and the number of incremental learning stages;

[0132] A building module, used to build a training data set;

[0133] A division module, used to determine the categories of each incremental learning stage and divide the training data set into T independent data sets, with each incremental learning stage corresponding to one data set; then, in the current incremental learning stage, the data set of each data distribution node is established according to the data set corresponding to that stage;

[0134] A model construction module, used to input the global shared model parameters of the previous incremental learning stage and the data set of each data distribution node in the current incremental learning stage to each data distribution node, and then perform incremental learning training under the constra...
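For orientation only, here is a skeleton of how the four modules of Embodiment 2 might be organized in code; every class, method, and helper name (splitter.split_by_stage, splitter.split_by_node, trainer.train_node, and so on) is a hypothetical placeholder rather than terminology from the patent.

```python
# Hypothetical skeleton of the system modules described in Embodiment 2.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class DataDistributedIncrementalLearningSystem:
    num_nodes: int = 0
    num_stages: int = 0
    stage_datasets: List = field(default_factory=list)       # T independent data sets
    node_datasets: Dict[int, object] = field(default_factory=dict)

    def determine(self, num_nodes: int, num_stages: int) -> None:
        """Determination module: fix the node count and the stage count."""
        self.num_nodes, self.num_stages = num_nodes, num_stages

    def build(self, training_data) -> None:
        """Building module: establish the overall training data set."""
        self.training_data = training_data

    def divide(self, stage: int, splitter) -> None:
        """Division module: split the training data into T stage-wise sets,
        then derive each node's data set for the current stage."""
        self.stage_datasets = splitter.split_by_stage(self.training_data, self.num_stages)
        self.node_datasets = splitter.split_by_node(self.stage_datasets[stage], self.num_nodes)

    def construct(self, global_params, trainer) -> Dict[int, dict]:
        """Model construction module: send previous-stage global parameters and the
        current stage's node data sets to each node and train incrementally."""
        return {n: trainer.train_node(global_params, self.node_datasets[n])
                for n in range(self.num_nodes)}
```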

Embodiment 3

[0141] A computer device, comprising a memory, a processor, and a computer program stored in the memory and operable on the processor, wherein the steps of the data distributed incremental learning method are realized when the processor executes the computer program.



Abstract

The invention discloses a data distributed incremental learning method, system and device, and a storage medium. The method comprises the following steps: determining the categories of each incremental learning stage and building a data set for each data distribution node; obtaining each data distribution node model; forming a shared data set; obtaining the model parameters of each data distribution node; performing weighted aggregation on the model parameters of the data distribution nodes to obtain a preliminary global shared model; integrating the predicted output logit values computed by the M data distribution node models on the shared data set to obtain an integrated output logit value, and having the preliminary global shared model learn the integrated output logit value on the shared data set to obtain the model parameters of the global shared model; and issuing the model parameters of the global shared model to all data distribution nodes and updating the global shared model on all data distribution nodes. The method, system, device and storage medium can effectively improve the learning ability of the model.
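As a rough illustration of the aggregation steps listed in the abstract (weighted aggregation of the M node models' parameters, integration of their output logit values on the shared data set, and learning of the integrated logits by the preliminary global shared model), here is a hedged PyTorch sketch. The simple averaging of logits, the KL-divergence objective, and all identifiers are assumptions made for illustration, not the patent's exact formulation.

```python
# Hypothetical sketch of the server-side aggregation described in the abstract.
import torch
import torch.nn.functional as F

def weighted_aggregate(node_states, weights):
    """Weighted average of the M nodes' parameter dictionaries (preliminary global model)."""
    agg = {}
    for key in node_states[0]:
        agg[key] = sum(w * s[key].float() for w, s in zip(weights, node_states))
    return agg

def distill_global(global_model, node_models, shared_loader, lr=0.01, T=2.0):
    """Let the preliminary global model learn the integrated node logits on the shared data set."""
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    global_model.train()
    for x, _ in shared_loader:
        with torch.no_grad():
            # integrate the M node models' predicted logits (assumed here: a plain mean)
            integrated = torch.stack([m(x) for m in node_models]).mean(dim=0)
        loss = F.kl_div(F.log_softmax(global_model(x) / T, dim=1),
                        F.softmax(integrated / T, dim=1),
                        reduction="batchmean") * T * T
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model.state_dict()
```

A stage would then end by issuing global_model.state_dict() to every data distribution node, matching the final step of the abstract.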

Description

Technical field

[0001] The invention belongs to the technical field of big data intelligent analysis, and relates to a data distributed incremental learning method, system, device and storage medium.

Background technique

[0002] Deep models have achieved great success across a wide range of research areas in artificial intelligence. However, they have turned out to be prone to the catastrophic forgetting problem. Catastrophic forgetting refers to the phenomenon in which a deep model's performance on old data degrades severely when the model is trained on new data. Incremental learning aims to alleviate the model's forgetting of old data while learning new data, and has become an important research topic in deep learning.

[0003] Current incremental learning frameworks require deep models to process continuous information flows in a centralized manner. Despite their success, we believe such centralized setups are often impossible or impractical. More and more data...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06N20/00
CPC: G06N20/00
Inventor: 洪晓鹏, 张晓涵, 董松林, 龚怡宏
Owner: XI AN JIAOTONG UNIV