You have a dataset that includes confidential data. You use the dataset to train a model.
You must use a differential privacy parameter to keep the data of individuals safe and private.
You need to reduce the effect of user data on aggregated results.
What should you do?
Differential privacy protects against the possibility that an attacker can combine an indefinite number of reports to eventually reveal sensitive data about an individual. A value known as epsilon measures how noisy, or private, a report is. Epsilon has an inverse relationship to noise and privacy: the lower the epsilon, the noisier (and more private) the data is.
https://docs.microsoft.com/en-us/azure/machine-learning/concept-differential-privacy
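The relationship between epsilon and noise can be illustrated with the Laplace mechanism, a standard way to add differentially private noise to a numeric query. This is a minimal sketch for intuition only, not the specific implementation Azure Machine Learning uses; the function names (`laplace_noise`, `private_count`) are illustrative:

```python
import random

def laplace_noise(scale: float) -> float:
    # A Laplace(0, scale) sample is the difference of two
    # independent exponential samples with mean `scale`.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(true_count: float, epsilon: float, sensitivity: float = 1.0) -> float:
    # Laplace mechanism: noise scale = sensitivity / epsilon.
    # A smaller epsilon produces a larger noise scale, so the
    # released value is noisier and more private.
    return true_count + laplace_noise(sensitivity / epsilon)

# Same true count, two privacy settings: epsilon = 0.1 adds
# roughly 100x more noise on average than epsilon = 10.
noisy_strict = private_count(100, epsilon=0.1)
noisy_loose = private_count(100, epsilon=10.0)
```

Averaged over many releases, the result with epsilon = 0.1 deviates far more from the true count than the result with epsilon = 10, which is exactly the trade-off the question describes: lowering epsilon reduces the effect any individual's data has on the aggregated result.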