
A privacy protection method and system for big data publishing

A privacy protection and big data technology, applied in the field of information security, that addresses the problems of privacy leakage, difficulty in meeting data privacy protection requirements, and large limitations, and achieves the effect of meeting availability requirements.

Active Publication Date: 2020-06-05
BEIJING INFORMATION SCI & TECH UNIV


Problems solved by technology

[0004] Although the above methods achieve a degree of privacy protection for data publishing, each has defects.
The k-anonymity algorithm places no constraint on the sensitive attribute data. When all records in a group share the same sensitive attribute value, that value is uniquely determined, so an attacker can easily obtain the private information. In other words, k-anonymity causes relatively little data loss and distortion, but offers relatively weak privacy protection.
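The k-anonymity property and its weakness can be illustrated with a minimal sketch (the toy table, attribute names, and generalized values below are hypothetical, not from the patent):

```python
from collections import Counter

def is_k_anonymous(records, quasi_ids, k):
    """Check k-anonymity: every combination of quasi-identifier
    values must appear in at least k records."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return all(count >= k for count in groups.values())

# Hypothetical generalized table: age range and ZIP prefix are the
# quasi-identifiers, "disease" is the sensitive attribute.
table = [
    {"age": "20-30", "zip": "100*", "disease": "flu"},
    {"age": "20-30", "zip": "100*", "disease": "flu"},
    {"age": "30-40", "zip": "101*", "disease": "cancer"},
    {"age": "30-40", "zip": "101*", "disease": "cancer"},
]
print(is_k_anonymous(table, ["age", "zip"], 2))  # True
```

Note that the table is 2-anonymous, yet each equivalence class holds a single sensitive value, so an attacker who locates a person's class still learns the disease — exactly the defect described above.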
The l-diversity algorithm guarantees that each equivalence class contains at least l distinct sensitive attribute values. However, when one value accounts for a large proportion of a class, it is still very likely that an attacker infers that value as the sensitive information, which leads to privacy leakage. That is, l-diversity provides stronger privacy protection than k-anonymity, but at the cost of greater information loss.
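A sketch of the (distinct) l-diversity check, again on a hypothetical toy table:

```python
from collections import defaultdict

def is_l_diverse(records, quasi_ids, sensitive, l):
    """Distinct l-diversity: each equivalence class must contain
    at least l distinct sensitive attribute values."""
    classes = defaultdict(set)
    for r in records:
        classes[tuple(r[q] for q in quasi_ids)].add(r[sensitive])
    return all(len(vals) >= l for vals in classes.values())

rows = [
    {"age": "20-30", "zip": "100*", "disease": "flu"},
    {"age": "20-30", "zip": "100*", "disease": "cold"},
    {"age": "30-40", "zip": "101*", "disease": "cancer"},
    {"age": "30-40", "zip": "101*", "disease": "cancer"},
]
print(is_l_diverse(rows, ["age", "zip"], "disease", 2))  # False: the
# second class holds only one distinct sensitive value.
```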
The t-closeness algorithm requires the distribution of sensitive attribute values within each class to be close to their distribution in the whole data table, which resolves the weakness of l-diversity. However, because t-closeness imposes the strictest privacy requirement, the resulting data quality is harder to make meet users' needs than with the other two methods, and its applicability is more limited: data satisfying the t-closeness constraint is difficult to use in data mining, data analysis, and similar applications. On the other hand, because it offers the strongest privacy protection, it suits higher-risk data release scenarios.
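A simplified t-closeness check can be sketched as follows. The standard definition measures the Earth Mover's Distance between each class's sensitive-value distribution and the table-wide distribution; for unordered categorical attributes this reduces to total variation distance, which is what the toy code below uses (the table is the same hypothetical example as above):

```python
from collections import Counter, defaultdict

def is_t_close(records, quasi_ids, sensitive, t):
    """Simplified t-closeness: the total variation distance between
    each class's sensitive-value distribution and the table-wide
    distribution must not exceed t."""
    n = len(records)
    overall = Counter(r[sensitive] for r in records)
    classes = defaultdict(list)
    for r in records:
        classes[tuple(r[q] for q in quasi_ids)].append(r[sensitive])
    for vals in classes.values():
        local = Counter(vals)
        dist = 0.5 * sum(abs(local[v] / len(vals) - overall[v] / n)
                         for v in overall)
        if dist > t:
            return False
    return True

rows = [
    {"age": "20-30", "zip": "100*", "disease": "flu"},
    {"age": "20-30", "zip": "100*", "disease": "cold"},
    {"age": "30-40", "zip": "101*", "disease": "cancer"},
    {"age": "30-40", "zip": "101*", "disease": "cancer"},
]
print(is_t_close(rows, ["age", "zip"], "disease", 0.5))  # True
print(is_t_close(rows, ["age", "zip"], "disease", 0.4))  # False
```

Tightening t (smaller values) forces each class to mirror the global distribution more closely, which is why strict t-closeness degrades data quality the most.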
[0005] It can be seen that the three anonymization methods above each have advantages and limitations, and the privacy parameter of each method also affects both the privacy protection achieved and the quality of the published data. In practice, users raise different privacy protection requirements depending on the purpose for which the data will be used, and data types also differ in sensitivity, so a single anonymization method can hardly satisfy the privacy protection needs of all data purposes. Therefore, there is as yet no practical, applicable result on how to scientifically and reasonably select the most appropriate method for a given privacy protection requirement and automatically find the optimal parameters that guarantee the protection effect.

Method used




Embodiment Construction

[0062] The following will clearly and completely describe the technical solutions in the embodiments of the present invention with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, not all, embodiments of the present invention. Based on the embodiments of the present invention, all other embodiments obtained by persons of ordinary skill in the art without making creative efforts belong to the protection scope of the present invention.

[0063] The purpose of the present invention is to provide a privacy protection method and system for big data publishing that can select the most suitable anonymization method and privacy parameters, so that the processed data both achieves the privacy protection effect desired by the data provider and meets the usability requirements of data users.

[0064] In order to make the above objects, features and advantages of the present invention more co...



Abstract

The invention discloses a privacy protection method and system for big data release. The method comprises the following steps: first, retrieving data according to the user's data demand range; then determining the user's security level according to the user identity and the data application, and determining a corresponding anonymization scheme and initial privacy parameters according to that security level; and then carrying out a privacy protection effect evaluation against the privacy protection requirement of the data provider and the data quality requirement of the user. If the evaluation fails, the parameters are adjusted; if parameter adjustment proves ineffective, the scheme itself is adjusted. After each adjustment, the privacy protection effect evaluation must be carried out again, and once the evaluation passes, privacy protection processing is applied to the retrieved to-be-published data according to the selected anonymization method and parameters to form the final publishable data. By applying this method, the most appropriate anonymization method and privacy parameters can be selected, so that the processed data not only achieves the privacy protection effect expected by the data provider but also meets the user's data availability requirements.
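The select-evaluate-adjust loop described in the abstract can be sketched roughly as follows. The scheme names, parameter grids, and the `evaluate` predicate below are illustrative placeholders, not details from the patent:

```python
# Schemes ordered from weakest protection / least information loss
# to strongest protection (hypothetical parameter grids).
SCHEMES = [
    ("k-anonymity", [2, 5, 10]),
    ("l-diversity", [2, 3, 4]),
    ("t-closeness", [0.3, 0.2, 0.1]),
]

def choose_scheme(data, security_level, evaluate):
    """Start from the scheme implied by the user's security level.
    Try each privacy parameter in turn; if no parameter passes the
    evaluation, fall back to the next (stronger) scheme."""
    for scheme, params in SCHEMES[security_level:]:
        for p in params:
            if evaluate(data, scheme, p):  # joint privacy + quality check
                return scheme, p
    return None  # no configuration satisfied both requirements

# Toy stand-in for the dual evaluation: accept l-diversity with l >= 3.
result = choose_scheme([], 0, lambda d, s, p: s == "l-diversity" and p >= 3)
print(result)  # ('l-diversity', 3)
```

The real system would implement `evaluate` as the two-sided check the abstract describes: the data provider's privacy requirement and the data user's quality requirement must both pass before publication.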

Description

Technical field

[0001] The invention relates to the technical field of information security, and in particular to a privacy protection method and system for big data publishing.

Background technique

[0002] In the fields of data trading and data sharing, how to provide data to those who need it without revealing privacy has become a thorny issue. To solve this problem, the industry has proposed many data publishing privacy protection technologies. Common privacy protection technologies fall roughly into the following categories according to their implementation: data conversion methods, data anonymization methods, secure multi-party computation methods, and hybrid methods. Among them, anonymization methods have been the most widely used because of their notable security and effectiveness.

[0003] The best-known anonymization algorithm is the k-anonymity algorithm, first proposed by Sweeney et al. in 1998, ...

Claims


Application Information

Patent Type & Authority: Patent (China)
IPC (8): G06F21/62
CPC: G06F21/6227; G06F21/6254
Inventor: 徐雅斌
Owner: BEIJING INFORMATION SCI & TECH UNIV