
Federated learning method based on dynamically adjusting model aggregation weights

A dynamic-weight federated learning technology, applied to computing models, machine learning, and computing. It addresses the problem that existing methods cannot provide a universally applicable, effective solution for heterogeneous data; its effects include improving the quality of the global model, distributing aggregation weights reasonably, and accelerating convergence.

Publication date: 2021-07-13 (pending)
Applicant: HANGZHOU DIANZI UNIV
Cites: 0 | Cited by: 11

AI Technical Summary

Problems solved by technology

[0005] Existing federated learning methods cannot provide a universally applicable solution for heterogeneous data. To address this, the present invention starts from model training information and mines client data and model features without revealing user privacy. Training information is quantified along three dimensions: model accuracy, data quality, and model difference. These three quantitative indicators are modeled jointly, and on that basis aggregation weights are set dynamically for each client, improving model accuracy and convergence speed in non-IID data scenarios.
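As a concrete illustration of quantifying these three dimensions and turning them into client weights, the following is a minimal NumPy sketch. The helper names and the linear mix with coefficients alpha, beta, gamma are assumptions for illustration, not the exact modeling claimed by the invention.

```python
# Hypothetical sketch: combine three per-client indicators (data quality,
# model accuracy, model difference) into a contribution score, then normalize
# the scores into aggregation weights. The linear mix below is an assumption.
import numpy as np

def contribution_scores(data_quality, accuracy, model_diff,
                        alpha=1.0, beta=1.0, gamma=1.0):
    """Each argument is a NumPy array with one entry per client.

    data_quality: higher means better local data quality
    accuracy:     accuracy of the client's local model
    model_diff:   distance between the local model and the current global model
    alpha/beta/gamma: illustrative mixing coefficients, not from the patent.
    """
    dq = data_quality / data_quality.sum()
    acc = accuracy / accuracy.sum()
    inv_diff = 1.0 / (model_diff + 1e-12)   # closer to the global model scores higher
    diff = inv_diff / inv_diff.sum()
    return alpha * dq + beta * acc + gamma * diff

def aggregation_weights(scores):
    """Normalize contribution scores so the weights sum to one."""
    return scores / scores.sum()

# Example with three clients.
w = aggregation_weights(contribution_scores(np.array([0.9, 0.5, 0.7]),
                                            np.array([0.82, 0.64, 0.75]),
                                            np.array([0.10, 0.45, 0.20])))
print(w)  # the client with good data, high accuracy and small drift gets the largest weight
```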


Detailed Description of Embodiments

[0017] It should be understood that all combinations of the foregoing concepts as well as additional concepts described in more detail below may be considered part of the inventive subject matter, provided such concepts are not mutually inconsistent. Additionally, all combinations of claimed subject matter are considered part of the inventive subject matter.

[0018] The foregoing and other aspects, embodiments and features of the present teachings can be more fully understood from the following description when taken in conjunction with the accompanying drawings. Other additional aspects of the invention, such as the features and/or advantages of the exemplary embodiments, will be apparent from the description below, or learned by practice of specific embodiments in accordance with the teachings of the invention.

[0019] The federated learning method of the present invention, based on dynamically adjusting model aggregation weights, is applicable to federated learning includin...


Abstract

The invention discloses a federated learning method based on dynamically adjusting model aggregation weights. In the method, a cloud server receives a locally trained model and a data quality index from each client. If the client weight update condition is met, the server calculates a contribution score for each client according to the contribution of the data quality, model accuracy, and model difference indicators to training accuracy, and performs a weighted average to generate a global model. The cloud server then issues the updated global model to the clients; each client trains the model on its local data after receiving the global model and, once training is finished, again uploads its local model and data quality index to the cloud server. By reasonably using client training information such as data distribution, model accuracy, and model difference to generate dynamic aggregation weights, the method fully exploits the features available in the client training process and forms a higher-quality global model, thereby improving model accuracy and convergence efficiency.
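As a round-level illustration of the server-side workflow described above, here is a minimal sketch assuming each client's model is a flattened NumPy vector and the per-client contribution scores have already been computed from the data quality, model accuracy, and model difference indices. The function names and the update-condition handling are illustrative, not the patented procedure.

```python
# Hypothetical sketch of one server aggregation round; models are flattened NumPy vectors.
import numpy as np

def aggregate(local_models, contribution_scores, prev_weights=None,
              update_weights=True):
    """local_models: list of 1-D NumPy arrays, one flattened model per client.
    contribution_scores: per-client scores derived from data quality,
    model accuracy and model difference (computation not shown here)."""
    if update_weights or prev_weights is None:
        scores = np.asarray(contribution_scores, dtype=float)
        weights = scores / scores.sum()   # normalize into aggregation weights
    else:
        weights = prev_weights            # keep the previous weights between updates
    # Weighted average of the client models forms the new global model,
    # which the server then sends back to every client for the next local round.
    global_model = sum(w * m for w, m in zip(weights, local_models))
    return global_model, weights

# Example: three clients with two-parameter models.
models = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
global_model, w = aggregate(models, contribution_scores=[0.5, 0.3, 0.2])
print(global_model, w)
```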

Description

Technical Field

[0001] The invention relates to the field of distributed machine learning, and in particular to a federated learning method based on dynamically adjusting model aggregation weights.

Background

[0002] As digital technology enters a period of rapid development, technologies such as big data and artificial intelligence are experiencing explosive growth. On the one hand, this brings new opportunities for upgrading and transforming traditional business formats; on the other hand, it inevitably poses new challenges to data and network security. To ensure data security and privacy, companies are unwilling to share data, so each enterprise's data can only be processed locally. This creates data islands, which prevent companies and researchers from repeatedly analyzing and mining data characteristics, hindering the development and application of big data and artificial intellig...


Application Information

IPC(8): G06N20/00
CPC: G06N20/00
Inventors: 牟元凯, 曾艳, 袁俊峰, 万健, 张纪林
Owner: HANGZHOU DIANZI UNIV