
Federated learning privacy protection method based on homomorphic encryption and secret sharing

A technology combining secret sharing and homomorphic encryption, applied to homomorphic encrypted communication, neural learning methods, digital data protection, etc., to achieve the effect of preventing easy recovery of the protected values

Active Publication Date: 2021-06-25
BEIJING UNIV OF TECH
Cites: 6 · Cited by: 22
  • Summary
  • Abstract
  • Description
  • Claims
  • Application Information

AI Technical Summary

Problems solved by technology

[0004] Aiming at the privacy leakage problem in existing federated learning, this invention proposes a federated learning privacy protection method based on homomorphic encryption and secret sharing. It applies a gradient protection method based on homomorphic encryption together with a random number protection method based on secret sharing to ensure the security of the participants' training data and to prevent leakage of the private information in that data.




Embodiment Construction

[0024] The present invention will be further described below in conjunction with the accompanying drawings and specific embodiments.

[0025] The specific implementation process of the federated learning privacy protection method based on homomorphic encryption and secret sharing of the present invention is shown in Figure 1 and includes the following steps:

[0026] Step 1: Initialization phase.

[0027] Participants complete the initialization of various parameters locally, including model parameters, key pairs, random numbers and shares.

[0028] Step 1.1: Initialization of model parameters.

[0029] (1) Each participant locally initializes the neural network model nn, the learning rate α, and the number of training rounds epoch; nn, α, and epoch are identical across all participants.

[0030] Step 1.2: Initialization of the key pair.

[0031] (1) The key generation server completes the generation of the public key pk and private key sk, and distributes them to e...
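The patent does not name the homomorphic scheme used for the key pair and the gradient ciphertexts. As an illustrative assumption only, the sketch below uses the Paillier cryptosystem, a standard additively homomorphic choice in which multiplying ciphertexts adds the underlying plaintexts, so an aggregation server can sum encrypted gradients without decrypting them. The primes here are demo-sized and insecure; a real deployment would use 2048-bit keys and fixed-point encoding of gradients.

```python
import random
from math import gcd

# Toy Paillier cryptosystem (additively homomorphic). Illustrative only:
# the patent does not specify this scheme, and these primes are far too small.

def keygen(p=2_147_483_647, q=2_147_483_629):  # small known primes, NOT secure
    n = p * q
    g = n + 1                        # standard simplification for the generator
    lam = (p - 1) * (q - 1)          # using phi(n) for lambda is valid here
    mu = pow(lam, -1, n)             # mu = lambda^-1 mod n (valid when g = n+1)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    n, g = pk
    n2 = n * n
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pk, sk, c):
    n, _ = pk
    lam, mu = sk
    n2 = n * n
    L = (pow(c, lam, n2) - 1) // n   # L(x) = (x - 1) / n
    return (L * mu) % n

pk, sk = keygen()
# Each participant encrypts its (integer-scaled) gradient; the aggregation
# server multiplies the ciphertexts, which adds the plaintext gradients.
grads = [17, 25, 42]
n2 = pk[0] * pk[0]
agg_ct = 1
for ct in (encrypt(pk, g) for g in grads):
    agg_ct = (agg_ct * ct) % n2
print(decrypt(pk, sk, agg_ct))       # prints the gradient sum, 84
```

The server only ever sees ciphertexts; decryption of the aggregate requires the private key held by the key generation server, matching the separation of roles described in the abstract.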



Abstract

The invention discloses a federated learning privacy protection method based on homomorphic encryption and secret sharing. The method comprises two parts, a gradient protection method based on homomorphic encryption and a random number protection method based on secret sharing, and covers four stages: an initialization stage, a model training stage, a model aggregation stage, and a model updating stage. The gradient protection method based on homomorphic encryption protects the gradients while still allowing aggregation of the gradient ciphertexts, which effectively prevents leakage of gradient privacy information and achieves secure aggregation of the gradients. The random number protection method based on secret sharing protects both the gradient ciphertext and the random number that protects it, so collusion attacks among the aggregation server, the key generation server, and the participants are effectively prevented, further guaranteeing the security of gradient information during interaction among these parties.
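The abstract's random-number protection relies on secret sharing, but the exact scheme is not given in this excerpt. A minimal sketch, assuming Shamir's (t, n) scheme over a prime field, of how a random mask could be split among participants so that no single server recovers it:

```python
import random

# Illustrative Shamir (t, n) secret sharing over a prime field, as one way to
# share the random number protecting a gradient ciphertext. The patent
# specifies secret sharing but not this exact scheme.

PRIME = 2**61 - 1  # Mersenne prime used as the field modulus

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):   # Horner evaluation of the polynomial
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

mask = random.randrange(PRIME)        # random number protecting a ciphertext
shares = share(mask, t=3, n=5)
assert reconstruct(shares[:3]) == mask    # any 3 of the 5 shares suffice
assert reconstruct(shares[1:4]) == mask
```

Because fewer than t shares reveal nothing about the mask, no coalition smaller than the threshold, whether servers or participants, can strip the randomness from a gradient ciphertext, which is the collusion resistance the abstract claims.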

Description

technical field

[0001] The invention belongs to the field of federated learning security technology, and specifically relates to a federated learning privacy protection method based on homomorphic encryption and secret sharing. The method proposes two techniques, a gradient protection method based on homomorphic encryption and a random number protection method based on secret sharing. It can both ensure the security of the gradients uploaded by the participants and prevent the participants' gradient information from leaking to the key distributor.

background technique

[0002] As a branch of artificial intelligence, deep learning requires a sufficient amount of data for training, but due to privacy concerns this condition is often not met. For example, in the medical field, medical data is highly sensitive and usually contains personal privacy information, so sharing data among multiple medical centers will lead to the disclosure ...

Claims


Application Information

IPC (8): H04L9/00; H04L9/08; G06F21/60; G06F21/62; G06N3/02; G06N3/08
CPC: H04L9/008; H04L9/085; H04L9/0869; G06F21/602; G06F21/6245; G06N3/02; G06N3/08
Inventors: 林莉, 张笑盈
Owner BEIJING UNIV OF TECH