
Google Exam Professional Cloud Developer Topic 11 Question 99 Discussion

Actual exam question for Google's Professional Cloud Developer exam
Question #: 99
Topic #: 11

You are developing an online gaming platform as a microservices application on Google Kubernetes Engine (GKE). Users on social media are complaining about long loading times for certain URL requests to the application. You need to investigate performance bottlenecks in the application and identify which HTTP requests have a significantly high latency span in user requests. What should you do?

Suggested Answer: D

Contribute your Thoughts:

Arlette
2 months ago
Option D could be interesting too. Getting metrics from the GKE cluster directly might provide a good overall picture of the performance.
Curtis
28 days ago
D) Configure GKE workload metrics using kubectl. Select all Pods to send their metrics to Cloud Monitoring. Create a custom dashboard of application metrics in Cloud Monitoring to determine performance bottlenecks of your GKE cluster.
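For context on option D: the "GKE workload metrics" feature has since been superseded by Google Cloud Managed Service for Prometheus, where you select which Pods send metrics to Cloud Monitoring with a PodMonitoring resource. A sketch of such a resource, applied with `kubectl apply -f` (the app label and port name here are placeholders, not from the question):

```yaml
# Illustrative PodMonitoring resource for Managed Service for Prometheus.
# "game-frontend" and the "metrics" port name are invented placeholders.
apiVersion: monitoring.googleapis.com/v1
kind: PodMonitoring
metadata:
  name: game-frontend-metrics
  namespace: default
spec:
  selector:
    matchLabels:
      app: game-frontend
  endpoints:
  - port: metrics    # container port exposing Prometheus metrics
    interval: 30s
```

The scraped metrics then appear in Cloud Monitoring, where a custom dashboard can chart them per workload.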
Sheridan
1 month ago
C) Instrument your microservices by installing the OpenTelemetry tracing package. Update your application code to send traces to Cloud Trace for inspection and analysis. Create an analysis report on Cloud Trace to analyze user requests.
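Option C's tracing approach works by wrapping each operation in a timed span, so that a slow child span points directly at the bottleneck inside a request. As a rough stdlib-only illustration of the idea (this toy is not the OpenTelemetry SDK, and every name in it is invented; the real SDK's `start_as_current_span` also propagates context and exports spans to Cloud Trace):

```python
import time
from contextlib import contextmanager

# Collected spans as (name, duration_seconds). In a real OpenTelemetry
# setup these would be exported to Cloud Trace, not kept in a list.
spans = []

@contextmanager
def span(name):
    """Toy stand-in for a tracer's start_as_current_span(name)."""
    start = time.perf_counter()
    try:
        yield
    finally:
        spans.append((name, time.perf_counter() - start))

def handle_request(path):
    # Nested spans let you see which step inside the request is slow.
    with span(f"GET {path}"):
        with span("db.query"):
            time.sleep(0.05)   # simulated slow backend call
        with span("render"):
            time.sleep(0.01)

handle_request("/lobby")
for name, dur in sorted(spans, key=lambda s: s[1], reverse=True):
    print(f"{name}: {dur * 1000:.1f} ms")
```

Sorting spans by duration is the manual version of what a Cloud Trace analysis report surfaces automatically across many requests.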
Otis
1 month ago
A) Update your microservices to log HTTP request methods and URL paths to STDOUT. Use the logs router to send container logs to Cloud Logging. Create filters in Cloud Logging to evaluate the latency of user requests across different methods and URL paths.
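To make option A concrete: once each service writes structured request logs to STDOUT and the logs router forwards them to Cloud Logging, you can filter and group by method and URL path. A stdlib sketch of that grouping logic (the log fields and latency values below are invented for illustration; in Cloud Logging you would express the same slicing with a query on `httpRequest` fields instead):

```python
import json
from collections import defaultdict
from statistics import median

# Toy structured log lines, shaped like what a microservice might write
# to STDOUT. Field names and values are invented for this example.
log_lines = [
    '{"method": "GET", "path": "/lobby", "latency_ms": 1240}',
    '{"method": "GET", "path": "/lobby", "latency_ms": 980}',
    '{"method": "GET", "path": "/profile", "latency_ms": 45}',
    '{"method": "POST", "path": "/match", "latency_ms": 130}',
]

# Group latencies by (method, path) -- the same slicing a Cloud Logging
# filter over method and URL path gives you.
by_route = defaultdict(list)
for line in log_lines:
    entry = json.loads(line)
    by_route[(entry["method"], entry["path"])].append(entry["latency_ms"])

# Rank routes by median latency to surface the slow endpoints.
ranked = sorted(by_route.items(), key=lambda kv: median(kv[1]), reverse=True)
for (method, path), latencies in ranked:
    print(f"{method} {path}: median {median(latencies)} ms")
```

Note the trade-off the thread is circling: this tells you *which* routes are slow, while tracing (option C) additionally tells you *where inside* a request the time goes.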
Loreta
2 months ago
Hah, option B with Wireshark? What is this, the 90s? We're in the cloud, people. Option C is the modern solution.
Annamaria
1 month ago
Yeah, using the OpenTelemetry tracing package is more efficient for investigating latency in user requests.
Venita
2 months ago
I agree, Option C is definitely the modern solution for analyzing performance bottlenecks.
Karrie
2 months ago
Option B with Wireshark is outdated. Option C with Open Telemetry tracing is the way to go.
Yasuko
2 months ago
Ah, the old 'throw more metrics at it' approach. Sometimes simple logging is all you need, my friend.
Emile
1 month ago
C) Instrument your microservices by installing the OpenTelemetry tracing package. Update your application code to send traces to Cloud Trace for inspection and analysis. Create an analysis report on Cloud Trace to analyze user requests.
Sabra
2 months ago
A) Update your microservices to log HTTP request methods and URL paths to STDOUT. Use the logs router to send container logs to Cloud Logging. Create filters in Cloud Logging to evaluate the latency of user requests across different methods and URL paths.
Theodora
2 months ago
I think configuring GKE workload metrics in Cloud Monitoring is the way to go to identify performance bottlenecks.
Carmela
3 months ago
Option C looks like the way to go. Tracing with Open Telemetry is a powerful tool for understanding performance issues in a microservices architecture.
Valentine
2 months ago
I agree, using Open Telemetry for tracing will definitely help us pinpoint the performance bottlenecks in our application.
Lindsey
2 months ago
Option C looks like the way to go. Tracing with Open Telemetry is a powerful tool for understanding performance issues in a microservices architecture.
Lizette
3 months ago
I prefer installing the Open Telemetry tracing package to analyze user requests.
Lashaunda
3 months ago
I agree with Adolph. Using Cloud Logging to evaluate latency sounds like a good idea.
Adolph
3 months ago
I think we should update our microservices to log HTTP requests and analyze the latency.
