Welcome to Pass4Success


IAPP Exam AIGP Topic 7 Question 19 Discussion

Actual exam question for IAPP's AIGP exam
Question #: 19
Topic #: 7
[All AIGP Questions]

A company initially intended to use a large data set containing personal information to train an AI model. After consideration, the company determined that it could derive enough value from the data set without any personal information, and it permanently obfuscated all personal data elements before training the model.

This is an example of applying which privacy-enhancing technique (PET)?

Suggested Answer: A

Anonymization is a privacy-enhancing technique that removes or permanently alters personal data elements so that individuals can no longer be identified. In this case, the company permanently obfuscated all personal data elements before training the model, which matches the definition of anonymization: the data cannot be traced back to individuals, yet the company can still derive value from the data set. This differs from pseudonymization, where identifiers are replaced with tokens but re-identification remains possible via a separately held key. Reference: AIGP Body of Knowledge, privacy-enhancing techniques section.
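The distinction the explanation draws, between permanently obfuscating data (anonymization) and replacing identifiers while keeping a re-identification key (pseudonymization), can be illustrated with a minimal sketch. This is not from the AIGP materials; the record fields and function names are illustrative assumptions.

```python
def anonymize(record):
    """Permanently drop direct identifiers and generalize quasi-identifiers.
    No key or mapping is kept, so the original values cannot be recovered
    (assumed illustrative fields: name, age, zip, purchases)."""
    return {
        "age_band": f"{(record['age'] // 10) * 10}s",  # e.g. 34 -> "30s"
        "region": record["zip"][:2] + "***",           # coarsen the ZIP code
        "purchases": record["purchases"],              # non-personal value kept
    }


def pseudonymize(record, key_store):
    """Replace the name with a token but retain a re-identification map.
    Because key_store can reverse the mapping, this is NOT anonymization."""
    token = f"user-{len(key_store)}"
    key_store[token] = record["name"]  # linkability is deliberately retained
    return dict(record, name=token)


raw = {"name": "Jane Doe", "age": 34, "zip": "94105", "purchases": 7}

print(anonymize(raw))        # no personal identifiers survive

keys = {}
print(pseudonymize(raw, keys))
print(keys)                  # the mapping that keeps pseudonymized data re-identifiable
```

The key point for the exam question: only the first function matches the scenario, because the company kept no way to restore the obfuscated elements.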


Contribute your Thoughts:

Gilma
10 days ago
Haha, I bet the data scientists had a field day with 'obfuscating' all that personal info. Sounds like they were really 'anonymizing' the heck out of that data!
upvoted 0 times
Judy
12 days ago
I'm not sure, but I think it could also be B) Pseudonymization because the data was obfuscated.
upvoted 0 times
Jamika
19 days ago
I agree with Lorriane, anonymization is the best choice to protect personal data.
upvoted 0 times
Huey
19 days ago
Hmm, I'm not so sure. This sounds more like differential privacy to me, where they added noise to the data to preserve privacy while still deriving value.
upvoted 0 times
Melodie
9 days ago
A) Anonymization.
upvoted 0 times
Gwenn
22 days ago
I'd say pseudonymization. Obfuscating the personal data elements but still keeping some linkability sounds like what they did here.
upvoted 0 times
Dominga
4 days ago
C) Differential privacy.
upvoted 0 times
Stephaine
7 days ago
B) Pseudonymization.
upvoted 0 times
Lamar
9 days ago
A) Anonymization.
upvoted 0 times
Lorriane
26 days ago
I think the answer is A) Anonymization.
upvoted 0 times
Goldie
30 days ago
Definitely anonymization. Removing all personal information from the data set before training the model is the textbook definition of anonymization.
upvoted 0 times
Amber
11 days ago
Definitely anonymization.
upvoted 0 times
Celeste
13 days ago
C) Differential privacy.
upvoted 0 times
Amie
15 days ago
B) Pseudonymization.
upvoted 0 times
Trina
17 days ago
A) Anonymization.
upvoted 0 times
