
Data protection method, device and apparatus, and computer storage medium

A data protection and computer program technology, applied in digital data protection, computer security devices, computing, and similar fields, which addresses problems such as the inflexibility of existing methods for protecting user privacy data.

Status: Pending
Publication Date: 2020-04-07
Applicant: ZTE CORP

AI Technical Summary

Problems solved by technology

[0003] In the related art, the methods for protecting user privacy data are not flexible enough, and cannot determine according to actual needs whether privacy protection is required for a given piece of user data.


Examples


Embodiment 1

[0031] The first embodiment of the present invention describes a data protection method. Figure 1 is a flowchart of a data protection method in an embodiment of the present invention. As shown in Figure 1, the process may include:

[0032] Step 101: Obtain n privacy sub-models, where each privacy sub-model is a data set representing one privacy attribute, the privacy attributes represented by the n privacy sub-models are different from one another, and n is an integer greater than 1.

[0033] To implement this step, for example, training data may first be obtained, where the training data represents user data generated while the application is running; the training data is then clustered to obtain the n privacy sub-models.

[0034] In an actual implementation, the original user data generated while the application is running can be obtained, and this original user data can be preprocessed to obtain the training data; for example, at le...
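For illustration only, the following is a minimal sketch of how step 101 might be realized, assuming the user records are short text strings, that preprocessing simply normalizes whitespace, and that the clustering is done with scikit-learn's KMeans; the names preprocess, build_privacy_submodels, and n_submodels are illustrative and not taken from the patent.

```python
# Hypothetical sketch of step 101: cluster preprocessed user data into
# n privacy sub-models (one cluster per privacy attribute).
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer


def preprocess(raw_records):
    """Toy preprocessing: drop empty records and normalize whitespace."""
    return [" ".join(r.split()) for r in raw_records if r and r.strip()]


def build_privacy_submodels(raw_records, n_submodels=3):
    """Return n sub-models, each a list of training records for one cluster."""
    training_data = preprocess(raw_records)
    vectorizer = TfidfVectorizer()
    features = vectorizer.fit_transform(training_data)

    kmeans = KMeans(n_clusters=n_submodels, n_init=10, random_state=0)
    labels = kmeans.fit_predict(features)

    # Each sub-model is the set of training records assigned to one cluster.
    submodels = {i: [] for i in range(n_submodels)}
    for record, label in zip(training_data, labels):
        submodels[int(label)].append(record)
    return submodels, kmeans, vectorizer
```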

Embodiment 2

[0061] On the basis of the data protection methods proposed in the foregoing embodiments of the present invention, further description is given below.

[0062] Figure 3 is a flowchart of another data protection method according to an embodiment of the present invention. As shown in Figure 3, the process may include:

[0063] Step 301: Obtain data to be processed and n privacy sub-models.

[0064] The implementation of this step has been described in the first embodiment, and will not be repeated here.

[0065] Step 302: Determine the privacy sub-model corresponding to the data to be processed.

[0066] The implementation of this step has been described in step 102, and will not be repeated here.
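As a purely illustrative sketch, and assuming the sub-models were built by clustering as in the sketch under Embodiment 1, the "corresponding" privacy sub-model for a piece of data to be processed could be taken as the cluster whose centroid is nearest to it; kmeans and vectorizer are the hypothetical objects returned by build_privacy_submodels above.

```python
def find_corresponding_submodel(data_to_process, kmeans, vectorizer):
    """Return the index of the privacy sub-model nearest to the input data."""
    features = vectorizer.transform([data_to_process])
    return int(kmeans.predict(features)[0])
```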

[0067] Step 303: Determine whether the correlation between the data to be processed and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold; if so, execute step 304; if not, end the process.

[0068] Step 304: Carry out early ...
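A hedged sketch of steps 303 and 304, assuming "correlation" is measured as the cosine similarity between the data to be processed and the centroid of its corresponding sub-model, and that the early-warning information is a simple dictionary; the threshold value of 0.6 is an arbitrary example, not a value given in the patent.

```python
from sklearn.metrics.pairwise import cosine_similarity


def check_and_warn(data_to_process, kmeans, vectorizer, threshold=0.6):
    """Generate early-warning information when correlation >= threshold."""
    features = vectorizer.transform([data_to_process])
    idx = int(kmeans.predict(features)[0])
    centroid = kmeans.cluster_centers_[idx].reshape(1, -1)
    correlation = float(cosine_similarity(features, centroid)[0, 0])

    if correlation >= threshold:
        # Step 304: prompt that the data needs privacy protection.
        return {"submodel": idx,
                "correlation": correlation,
                "warning": "data to be processed requires privacy protection"}
    return None  # correlation below threshold: end the process
```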

Embodiment 3

[0072] On the basis of the data protection methods provided in the foregoing embodiments, a fourth embodiment of the present invention provides a data protection device.

[0073] Figure 4 is a schematic diagram of the composition and structure of a data protection device according to an embodiment of the present invention. As shown in Figure 4, the device includes an acquisition module 401 and a decision module 402, wherein:

[0074] The acquisition module 401 is configured to obtain n privacy sub-models, where each privacy sub-model is a data set representing one privacy attribute, the privacy attributes represented by the n privacy sub-models are different from one another, and n is an integer greater than 1;

[0075] The decision module 402 is configured to acquire data to be processed and determine the privacy sub-model corresponding to the data to be processed, and, when the correlation between the data to be processed and the corresponding privacy sub-model is greater t...
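To make the module split concrete, here is a structural sketch only, assuming the two modules of Figure 4 are modeled as plain Python classes; the class and method names are hypothetical, and the correlation computation is assumed to be delegated to helpers like those sketched in the earlier embodiments.

```python
class AcquisitionModule:
    """Module 401: obtains the n privacy sub-models (injected here directly)."""

    def __init__(self, submodels):
        self.submodels = submodels

    def get_submodels(self):
        return self.submodels


class DecisionModule:
    """Module 402: matches data to a sub-model and raises an early warning."""

    def __init__(self, threshold=0.6):
        self.threshold = threshold

    def decide(self, data_to_process, submodel_id, correlation):
        # Generate early-warning information when the correlation reaches
        # the preset threshold; otherwise no action is taken.
        if correlation >= self.threshold:
            return (f"early warning: data matches privacy sub-model "
                    f"{submodel_id}, privacy protection required")
        return None
```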



Abstract

Embodiments of the invention provide a data protection method, device and apparatus, and a computer storage medium. The method comprises the steps of obtaining n privacy sub-models; wherein each privacy sub-model is a data set representing one privacy attribute, the privacy attributes represented by the n privacy sub-models are different from one another, and n is an integer greater than 1; obtaining to-be-processed data, and determining a privacy sub-model corresponding to the to-be-processed data; and when the correlation between the to-be-processed data and the corresponding privacy sub-model is greater than or equal to a preset correlation threshold, generating early warning information to prompt that the to-be-processed data needs to be subjected to privacy protection.

Description

Technical Field

[0001] Embodiments of the present invention relate to, but are not limited to, privacy data protection technologies, and in particular to a data protection method, device, apparatus, and computer storage medium.

Background

[0002] With the rapid development of the mobile Internet, the various applications on mobile terminals have become an important tool for users to understand the world. Because of the openness and interoperability of the Internet, users pay more and more attention to personal online privacy. Although user privacy is sensitive information, it is still exposed at all times: user behavior traces such as searching, browsing, downloading, payment, location, and exercise are collected, stored, and analyzed by various websites, apps, and terminals, and are then used for precision marketing or other commercial purposes, and may even lead to information leakage, identity theft, malicious attacks, and other hazards. ...


Application Information

Patent Type & Authority: Application (China)
IPC(8): G06F21/62
CPC: G06F21/6245; G06F21/62
Inventor: 艾东梅
Owner: ZTE CORP