Welcome to Pass4Success


Google Exam Professional Cloud Architect Topic 2 Question 97 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 97
Topic #: 2
[All Professional Cloud Architect Questions]

Your company has an application that is running on multiple instances of Compute Engine. It generates 1 TB per day of logs. For compliance reasons, the logs need to be kept for at least two years. The logs need to be available for active query for 30 days. After that, they just need to be retained for audit purposes. You want to implement a storage solution that is compliant, minimizes costs, and follows Google-recommended practices. What should you do?
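For a sense of scale, the retention requirement can be worked out directly from the numbers in the question. The sketch below estimates the steady-state volume and compares a flat Standard-storage bill with a tiered Standard-then-Coldline layout; the per-GB prices are illustrative assumptions, not current GCS list prices.

```python
# Rough volume and cost estimate for the scenario above (illustrative only).
TB_PER_DAY = 1
ACTIVE_DAYS = 30          # logs must stay queryable for 30 days
RETENTION_DAYS = 2 * 365  # compliance minimum of two years

total_tb = TB_PER_DAY * RETENTION_DAYS                      # retained at steady state
coldline_tb = TB_PER_DAY * (RETENTION_DAYS - ACTIVE_DAYS)   # eligible for Coldline

# Assumed prices in USD per GB-month -- check current Cloud Storage pricing.
STANDARD, COLDLINE = 0.020, 0.004
monthly_all_standard = total_tb * 1024 * STANDARD
monthly_tiered = (ACTIVE_DAYS * TB_PER_DAY * 1024 * STANDARD
                  + coldline_tb * 1024 * COLDLINE)

print(total_tb, coldline_tb)
print(round(monthly_all_standard, 1), round(monthly_tiered, 1))
```

Even with rough prices, tiering the bulk of the data into Coldline cuts the monthly storage bill by roughly a factor of four, which is why the lifecycle rule matters as much as the sink itself.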

Suggested Answer: B

The Google-recommended practice for managing logs generated on Compute Engine is to install the Cloud Logging agent on each instance and send the logs to Cloud Logging.

From there, a Cloud Logging sink can aggregate the logs and export them to Cloud Storage.

Cloud Storage is the right destination because the requirements call for a lifecycle policy based on the age of the stored logs.

The logs must be available for active query for 30 days after they are written; after that, they only need to be retained for audit purposes.

During the active-query window the logs can still be queried in place (for example, via BigQuery's external-table support for data in Cloud Storage), and moving objects older than 30 days to Coldline yields a cost-optimal solution.

Therefore, the correct answer is:

1. Install the Cloud Logging agent on all instances.

2. Create a sink that exports the logs to a regional Cloud Storage bucket.

3. Create an Object Lifecycle rule to move the files to the Coldline storage class after one month.

4. Set up a bucket-level retention policy using bucket lock.
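Steps 3 and 4 above can be expressed concretely. The sketch below builds the lifecycle rule as the JSON document Cloud Storage accepts (e.g. applied with `gsutil lifecycle set`) and computes the two-year retention period in seconds for the bucket-level retention policy; the schema fields follow the documented Object Lifecycle format, and the numbers mirror the question's requirements.

```python
# Lifecycle rule: move objects to Coldline 30 days after creation.
# This dict matches the JSON schema that `gsutil lifecycle set` expects.
import json

lifecycle = {
    "rule": [
        {
            "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
            "condition": {"age": 30},  # days since object creation
        }
    ]
}

# Bucket-level retention period of two years, expressed in seconds.
# Locking the policy afterwards (bucket lock) makes it irreversible,
# which is what satisfies the audit/compliance requirement.
retention_seconds = 2 * 365 * 24 * 3600

print(json.dumps(lifecycle))
print(retention_seconds)
```

Note that once the retention policy is locked, it cannot be shortened or removed, so the two-year value should be confirmed against the actual compliance mandate before locking.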


Contribute your Thoughts:

Jonell
2 months ago
I'm just hoping this exam doesn't turn into a 'log'-jam of questions. Get it? Log-jam? Oh, never mind...
upvoted 0 times
Tashia
30 days ago
D) 1. Write a daily cron job, running on all instances, that uploads logs into a Cloud Storage bucket. 2. Create a sink to export logs into a regional Cloud Storage bucket. 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month.
upvoted 0 times
...
Fannie
1 month ago
B) 1. Install the Cloud Ops agent on all instances. 2. Create a sink to export logs into a regional Cloud Storage bucket. 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month. 4. Configure a retention policy at the bucket level to create a lock.
upvoted 0 times
...
Chantell
1 month ago
A) 1. Install the Cloud Ops agent on all instances. 2. Create a sink to export logs into a partitioned BigQuery table. 3. Set a time_partitioning_expiration of 30 days.
upvoted 0 times
...
...
Ashlyn
2 months ago
Option B all the way! It's like playing a game of 'Log Tetris' - gotta keep those blocks (er, files) organized and cost-efficient.
upvoted 0 times
...
Romana
2 months ago
Option C is a no-go. Running a daily cron job on all instances to upload logs? That's just asking for trouble. Let's stick to the Google-recommended practices.
upvoted 0 times
Golda
17 days ago
B) 4. Configure a retention policy at the bucket level to create a lock.
upvoted 0 times
...
Hector
19 days ago
B) 3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month.
upvoted 0 times
...
Howard
20 days ago
B) 2. Create a sink to export logs into a regional Cloud Storage bucket.
upvoted 0 times
...
Isadora
23 days ago
B) 1. Install the Cloud Ops agent on all instances.
upvoted 0 times
...
Lashaun
1 month ago
4. Configure a retention policy at the bucket level to create a lock.
upvoted 0 times
...
Caren
2 months ago
3. Create an Object Lifecycle rule to move files into a Coldline Cloud Storage bucket after one month.
upvoted 0 times
...
Effie
2 months ago
2. Create a sink to export logs into a regional Cloud Storage bucket.
upvoted 0 times
...
Twana
2 months ago
B) 1. Install the Cloud Ops agent on all instances.
upvoted 0 times
...
...
Caprice
3 months ago
Option D seems a bit convoluted. Why not just use the Cloud Ops agent to export logs directly to a Cloud Storage bucket? That's the easiest way to go.
upvoted 0 times
Carli
1 month ago
That's a good point. Option B seems to cover all the requirements for compliance and cost optimization.
upvoted 0 times
...
Rosenda
2 months ago
But wouldn't it be better to have a retention policy in place to ensure compliance and cost-effectiveness?
upvoted 0 times
...
Ryan
2 months ago
I agree, using the Cloud Ops agent for direct export sounds simpler and more efficient.
upvoted 0 times
...
...
Gail
3 months ago
I agree with you, B seems to be the most cost-effective solution.
upvoted 0 times
...
Carman
3 months ago
I'm leaning towards Option A. Exporting logs to a partitioned BigQuery table seems like a simpler and more efficient solution than dealing with multiple Cloud Storage buckets.
upvoted 0 times
...
Ressie
3 months ago
I think the best option is B.
upvoted 0 times
...
Starr
3 months ago
Option B looks like the best choice. Exporting logs to a regional Cloud Storage bucket and then moving them to Coldline after a month is a good way to balance accessibility and cost-effectiveness.
upvoted 0 times
Rolf
1 month ago
Setting a retention policy at the bucket level adds an extra layer of security for audit purposes.
upvoted 0 times
...
Margarett
1 month ago
Creating an Object Lifecycle rule to move files to Coldline after a month is a smart way to save on storage costs.
upvoted 0 times
...
Kayleigh
1 month ago
Installing the Cloud Ops agent on all instances is a crucial step in ensuring all logs are captured.
upvoted 0 times
...
Mari
2 months ago
I agree, option B seems like the most cost-effective solution while still meeting compliance requirements.
upvoted 0 times
...
Marguerita
2 months ago
Yes, using Object Lifecycle rules to automatically move files to Coldline Cloud Storage after a month is a smart way to manage costs while still meeting compliance requirements.
upvoted 0 times
...
Kimbery
2 months ago
I agree, it's important to have the logs accessible for active query for 30 days but then move them to a more cost-effective storage solution for audit purposes.
upvoted 0 times
...
...
