
Google Exam Professional Cloud Architect Topic 8 Question 80 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 80
Topic #: 8
[All Professional Cloud Architect Questions]

For this question, refer to the TerramEarth case study. To be compliant with European GDPR regulation, TerramEarth is required to delete data generated from its European customers after a period of 36 months when it contains personal data. In the new architecture, this data will be stored in both Cloud Storage and BigQuery. What should you do?

Suggested Answer: B

The recommended practice for managing logs generated on Compute Engine is to install the Cloud Logging agent on each instance and send the logs to Cloud Logging.

The collected logs can then be routed through a Cloud Logging sink and exported to Cloud Storage.

Cloud Storage is the appropriate destination for the logs because the requirement calls for lifecycle management based on how long the data has been stored.

In this case, the logs will be used for active queries for 30 days after they are saved, but after that they need to be retained for a longer period for auditing purposes.

If the data is still needed for active queries, BigQuery can query it in place in Cloud Storage via an external table, and data older than 30 days can be moved to Coldline to build a cost-optimal solution.
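As a sketch of that query-in-place pattern, an external table can expose log files in Cloud Storage to BigQuery without loading them. The bucket, dataset, and table names below are placeholders, and the exact flags depend on the log format (newline-delimited JSON is assumed here):

```shell
# Generate an external table definition pointing at the log objects
# (adjust --source_format to match how the logs are written).
bq mkdef --autodetect --source_format=NEWLINE_DELIMITED_JSON \
  "gs://example-eu-logs/logs/*.json" > logs_def.json

# Create a BigQuery external table backed by the Cloud Storage objects,
# so active queries can run against them during the first 30 days.
bq mk --external_table_definition=logs_def.json logs_dataset.compute_logs
```

These commands require an authenticated `bq` CLI and an existing dataset; they are illustrative, not part of the suggested answer's literal steps.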

Therefore, the correct answer is as follows:

1. Install the Cloud Logging agent on all instances.

2. Create a sink that exports the logs to a regional Cloud Storage bucket.

3. Create an Object Lifecycle rule to move the files to a Coldline Cloud Storage bucket after one month.

4. Set up a bucket-level retention policy using bucket lock.
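Under those assumptions, the four steps might be sketched as follows. The project, bucket, and sink names are placeholders, and running this requires a GCP project with the appropriate IAM permissions:

```shell
# 1. Install the Cloud Logging agent on each Compute Engine instance.
curl -sSO https://dl.google.com/cloudagents/install-logging-agent.sh
sudo bash install-logging-agent.sh

# 2. Create a sink that exports instance logs to a regional
#    Cloud Storage bucket (bucket assumed to exist already).
gcloud logging sinks create eu-logs-sink \
  storage.googleapis.com/example-eu-logs \
  --log-filter='resource.type="gce_instance"'

# 3. Move objects to Coldline after 30 days via an Object Lifecycle rule.
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
      "condition": {"age": 30}
    }
  ]
}
EOF
gsutil lifecycle set lifecycle.json gs://example-eu-logs

# 4. Apply a 36-month retention policy at the bucket level, then lock it.
gsutil retention set 36m gs://example-eu-logs
gsutil retention lock gs://example-eu-logs
```

Note that locking a retention policy is irreversible: once locked, the policy cannot be removed or shortened for the life of the bucket, which is exactly the auditability guarantee the answer relies on.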


Contribute your Thoughts:

Lottie
5 hours ago
I'm not sure, but option A seems to be the most logical choice based on the requirements. It's important to follow the regulations and ensure data is deleted after 36 months.
upvoted 0 times
...
Tyisha
1 day ago
I agree with Novella. It makes sense to store the data in both BigQuery and Cloud Storage with the specified retention period to comply with GDPR.
upvoted 0 times
...
Novella
6 days ago
I think the answer is A) Create a BigQuery table for the European data, and set the table retention period to 36 months. For Cloud Storage, use gsutil to enable lifecycle management using a DELETE action with an Age condition of 36 months.
upvoted 0 times
...
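For reference, option A as Novella describes it could be sketched like this. The dataset, table, and bucket names are placeholders, and 36 months is approximated in seconds for the table expiration and in days for the lifecycle rule:

```shell
# Expire the BigQuery table holding European data after ~36 months
# (36 * 30 days, expressed in seconds as bq requires).
bq update --expiration 93312000 eu_dataset.customer_data

# Delete Cloud Storage objects once they are ~36 months (1080 days) old.
cat > delete_lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "Delete"},
      "condition": {"age": 1080}
    }
  ]
}
EOF
gsutil lifecycle set delete_lifecycle.json gs://example-eu-data
```

One caveat worth weighing against the suggested answer: a BigQuery table expiration deletes the entire table at once, not individual rows as they age, so it only satisfies the 36-month rule if each table holds data of a single age cohort (e.g. partitioned or dated tables).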
