For this question, refer to the TerramEarth case study. To comply with the European GDPR regulation, TerramEarth is required to delete data generated from its European customers after a period of 36 months when it contains personal data. In the new architecture, this data will be stored in both Cloud Storage and BigQuery. What should you do?
The standard practice for managing logs generated on Compute Engine is to install the Cloud Logging agent on each instance and send the logs to Cloud Logging. The collected logs are then aggregated through a Cloud Logging sink and exported to Cloud Storage.
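As a rough sketch of this export step, the sink can be created with the google-cloud-logging client library. The project ID, sink name, log filter, and bucket name below are placeholder assumptions, not values from the case study:

```python
# Minimal sketch: create a Cloud Logging sink that exports Compute Engine
# logs to a Cloud Storage bucket. All names are hypothetical placeholders.
from google.cloud import logging

client = logging.Client(project="terramearth-demo")  # assumed project ID

# For Cloud Storage targets, the destination uses the
# storage.googleapis.com/<bucket> form.
sink = client.sink(
    "compute-logs-to-gcs",                      # hypothetical sink name
    filter_='resource.type="gce_instance"',     # only Compute Engine logs
    destination="storage.googleapis.com/terramearth-eu-logs",
)

if not sink.exists():
    sink.create()
    print(f"Created sink {sink.name}")
# Note: grant the sink's writer identity (sink.writer_identity)
# objectCreator access on the destination bucket so exports succeed.
```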
Cloud Storage is used as the destination for the logs because the requirement calls for a lifecycle policy based on the storage period. In this case, the logs are actively queried for 30 days after they are saved, but after that they need to be retained for a longer period for auditing purposes.
For the active-query phase, BigQuery can query the data in place in Cloud Storage through an external table, and data older than 30 days can be moved to Coldline, which makes for a cost-optimal solution.
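To illustrate the "query in place" idea, here is a hedged sketch using the google-cloud-bigquery client to define an external table over the exported log files. The dataset, table ID, and source URI are assumptions for illustration only:

```python
# Minimal sketch: define a BigQuery external table over log files exported
# to Cloud Storage, so they can be queried without loading them into
# BigQuery storage. Dataset, table, and bucket path are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="terramearth-demo")  # assumed project ID

external_config = bigquery.ExternalConfig("NEWLINE_DELIMITED_JSON")
external_config.source_uris = ["gs://terramearth-eu-logs/*.json"]
external_config.autodetect = True  # infer the schema from the files

table = bigquery.Table("terramearth-demo.ops.compute_logs_external")
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)

# The external table can now be queried like any native table.
query = """
    SELECT COUNT(*) AS n
    FROM `terramearth-demo.ops.compute_logs_external`
"""
print(list(client.query(query).result())[0].n)
```

An external table keeps only one copy of the data (in Cloud Storage), which is what makes the lifecycle transition to Coldline after 30 days the cost lever here.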
Therefore, the correct answer is as follows:
1. Install the Cloud Logging agent on all instances.
2. Create a sink that exports the logs to a regional Cloud Storage bucket.
3. Create an Object Lifecycle rule to move the files to a Coldline Cloud Storage bucket after one month.
4. Set up a bucket-level retention policy using Bucket Lock.
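Steps 3 and 4 could look roughly like the following with the google-cloud-storage client. The bucket name and the retention length are assumed values, and note that locking a retention policy with Bucket Lock is permanent:

```python
# Minimal sketch of steps 3 and 4: add a lifecycle rule that transitions
# objects to Coldline after 30 days, then set and lock a bucket-level
# retention policy. Bucket name and retention length are hypothetical.
from google.cloud import storage

client = storage.Client(project="terramearth-demo")  # assumed project ID
bucket = client.get_bucket("terramearth-eu-logs")

# Step 3: transition objects older than 30 days to Coldline storage class.
bucket.add_lifecycle_set_storage_class_rule("COLDLINE", age=30)
bucket.patch()

# Step 4: retain every object for the audit period, then lock the policy.
# Locking is irreversible: the retention period can no longer be reduced.
bucket.retention_period = 365 * 24 * 60 * 60  # one year, an assumed value
bucket.patch()
bucket.lock_retention_policy()
```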