
Google Exam Professional Cloud Architect Topic 6 Question 100 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 100
Topic #: 6

Your company has a Google Cloud project that uses BigQuery for data warehousing on a pay-per-use basis. You want to monitor queries in real time to discover the most costly queries and which users spend the most. What should you do?

Suggested Answer: C

https://cloud.google.com/blog/products/data-analytics/taking-a-practical-approach-to-bigquery-cost-monitoring
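As a minimal sketch of the suggested answer's second step, the query below assumes billing export into BigQuery has already been activated (step 1). The project, dataset, and export table name are placeholders; the real table is named `gcp_billing_export_v1_<BILLING_ACCOUNT_ID>`.

```python
# Hedged sketch of option C, step 2: query the billing export table for
# BigQuery charges. The table name below is a placeholder -- the real one
# includes your billing account ID.
from google.cloud import bigquery

client = bigquery.Client()

QUERY = """
SELECT
  sku.description AS sku,
  DATE(usage_start_time) AS usage_day,
  SUM(cost) AS total_cost
FROM `my-project.billing.gcp_billing_export_v1_XXXXXX`
WHERE service.description = 'BigQuery'
GROUP BY sku, usage_day
ORDER BY total_cost DESC
"""

for row in client.query(QUERY).result():
    print(f"{row.usage_day} {row.sku}: {row.total_cost:.2f}")
```

One caveat worth hedging: the standard billing export breaks cost down by service, SKU, and project, so attributing spend to individual users usually relies on job labels or on the data access logs route discussed in the comments below.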


Contribute your Thoughts:

Jacki
17 days ago
Option C, huh? I guess they're really trying to squeeze every penny out of our BigQuery usage. Better keep those queries lean and mean!
upvoted 0 times
...
Vallie
19 days ago
Option D? More like Option 'Doh!' Am I right? Someone's been drinking too much cloud Kool-Aid.
upvoted 0 times
Jaime
1 day ago
A) Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. Develop a Dataflow pipeline to compute the cost of queries split by users.
upvoted 0 times
...
...
Arlene
22 days ago
I'd go with Option B. Keeping the logs in BigQuery just makes sense - no need to export them anywhere else. Efficient and straightforward.
upvoted 0 times
...
Marvel
26 days ago
Option A gets my vote. Dataflow can crunch the numbers and give us the insights we need. Plus, Storage is much cooler than BigQuery, right?
upvoted 0 times
...
Eric
28 days ago
Option D is way too complicated. Who has time to mess with all those labels and Billing reports? No thanks!
upvoted 0 times
Quentin
6 hours ago
B) 1. Create a Cloud Logging sink to export BigQuery data access logs to BigQuery. 2. Perform a BigQuery query on the generated table to extract the information you need.
upvoted 0 times
...
Rosalind
3 days ago
A) 1. Create a Cloud Logging sink to export BigQuery data access logs to Cloud Storage. 2. Develop a Dataflow pipeline to compute the cost of queries split by users.
upvoted 0 times
...
...
Ria
1 month ago
Option C looks good to me. Tapping into the billing data should give us the details we need on query costs and user activity.
upvoted 0 times
Rasheeda
15 days ago
2. Perform a BigQuery query on the billing table to extract the information you need.
upvoted 0 times
...
Beckie
24 days ago
C) 1. Activate billing export into BigQuery.
upvoted 0 times
...
...
Deeanna
1 month ago
I'm not sure, I think option B could also work. Exporting BigQuery data access logs to BigQuery and performing a query on the generated table might provide the information we need.
upvoted 0 times
...
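As a hedged sketch of this option B route: assuming a Cloud Logging sink already routes BigQuery data access logs into a dataset (here called `bq_audit_logs`, a placeholder), a query over the exported audit table can rank users by billed bytes. The $5 per TiB on-demand rate used below is an assumption; check current pricing.

```python
# Sketch of option B, step 2: rank users by estimated on-demand query cost
# using the audit-log table written by the Cloud Logging sink.
# Dataset/table names and the $5/TiB rate are assumptions.
from google.cloud import bigquery

client = bigquery.Client()

QUERY = """
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user_email,
  SUM(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent
      .job.jobStatistics.totalBilledBytes) / POW(2, 40) AS billed_tib
FROM `my-project.bq_audit_logs.cloudaudit_googleapis_com_data_access`
WHERE protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent
      .eventName = 'query_job_completed'
GROUP BY user_email
ORDER BY billed_tib DESC
"""

for row in client.query(QUERY).result():
    est_cost = row.billed_tib * 5.0  # assumed on-demand $/TiB
    print(f"{row.user_email}: {row.billed_tib:.3f} TiB (~${est_cost:.2f})")
```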
Charolette
1 month ago
I think Option B is the way to go. Exporting the logs directly to BigQuery makes the data more accessible for querying.
upvoted 0 times
Joesph
17 days ago
Let's go with Option B then. It's a straightforward solution to analyze costly queries and user spending.
upvoted 0 times
...
Katlyn
29 days ago
I agree, it seems like the most efficient way to monitor queries and track user activity.
upvoted 0 times
...
Paulina
1 month ago
Option B is a good choice. Exporting logs to BigQuery will make it easier to extract the information we need.
upvoted 0 times
...
...
Sabine
1 month ago
I agree with Lacey. Option A seems like the most efficient way to monitor queries in real time and identify the most costly queries and users.
upvoted 0 times
...
Lacey
1 month ago
I think option A is the best choice because it involves exporting BigQuery data access logs to Cloud Storage and developing a Dataflow pipeline to compute the cost of queries split by users.
upvoted 0 times
...
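For step 1 of option A, here is a minimal sketch using the google-cloud-logging client; the sink name, bucket, and log filter are illustrative assumptions, and the Dataflow pipeline of step 2 is omitted.

```python
# Sketch of option A, step 1: create a Cloud Logging sink that exports
# BigQuery data access logs to a Cloud Storage bucket. Sink name, bucket,
# and filter are placeholders.
from google.cloud import logging

client = logging.Client()
sink = client.sink(
    "bq-data-access-to-gcs",
    filter_='resource.type="bigquery_resource" AND logName:"data_access"',
    destination="storage.googleapis.com/my-bq-audit-bucket",
)
if not sink.exists():
    sink.create()
print(f"Sink {sink.name} -> {sink.destination}")
```

After the sink exists, its writer identity must be granted write access on the bucket before logs flow; the Dataflow pipeline of step 2 would then read those exported files to compute cost per user.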
Lavonna
2 months ago
Option A seems the most comprehensive approach to monitoring BigQuery queries and costs in real-time.
upvoted 0 times
Daisy
26 days ago
Yes, it would definitely help in identifying the most costly queries and which users are spending the most.
upvoted 0 times
...
Oren
28 days ago
I agree, creating a Cloud Logging sink to export BigQuery data access logs to Cloud Storage and developing a Dataflow pipeline sounds efficient.
upvoted 0 times
...
...
