Welcome to Pass4Success


Google Exam Professional Machine Learning Engineer Topic 1 Question 94 Discussion

Actual exam question for Google's Professional Machine Learning Engineer exam
Question #: 94
Topic #: 1

You are implementing a batch inference ML pipeline in Google Cloud. The model was developed by using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset that is stored in a BigQuery table. You want to perform inference with minimal effort. What should you do?

Suggested Answer: B

Vertex AI batch prediction is the most appropriate and efficient way to apply a pre-trained TensorFlow SavedModel to a large historical dataset with minimal effort.

The workflow has two steps: export the historical data from the BigQuery table to Cloud Storage in a supported format (such as Avro or CSV), then configure a Vertex AI batch prediction job that reads the exported files and applies the SavedModel (also stored in Cloud Storage) to generate predictions.

Avro is the recommended export format for large datasets: it is a compact, binary, schema-carrying format that is efficient to read and write at scale in Google Cloud, which is why option B is correct.
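As a sketch of how the two steps above could be wired up programmatically, the snippet below builds the JSON request bodies for a BigQuery extract job (table to Avro files in Cloud Storage) and a Vertex AI batch prediction job. All project, bucket, table, and model names are hypothetical placeholders, and the field layout follows the public BigQuery `jobs.insert` and Vertex AI `batchPredictionJobs.create` REST APIs as I understand them; treat it as illustrative, not a drop-in implementation.

```python
# Illustrative only: builds the request bodies for the two-step
# batch-inference flow (BigQuery export -> Vertex AI batch prediction).
# No API calls are made; every resource name is a placeholder.


def bigquery_extract_request(project, dataset, table, gcs_uri):
    """Request body for BigQuery jobs.insert that exports a table to Avro."""
    return {
        "configuration": {
            "extract": {
                "sourceTable": {
                    "projectId": project,
                    "datasetId": dataset,
                    "tableId": table,
                },
                "destinationUris": [gcs_uri],
                # Avro is compact and carries its own schema, which is
                # why the suggested answer prefers it for large exports.
                "destinationFormat": "AVRO",
            }
        }
    }


def batch_prediction_request(model_name, input_uris, output_prefix):
    """Request body for Vertex AI batchPredictionJobs.create."""
    return {
        "displayName": "historical-batch-inference",
        "model": model_name,  # e.g. projects/.../locations/.../models/...
        "inputConfig": {
            # The instances format must be one your model accepts; check
            # the Vertex AI batch prediction docs for supported formats.
            "instancesFormat": "avro",
            "gcsSource": {"uris": input_uris},
        },
        "outputConfig": {
            "predictionsFormat": "jsonl",
            "gcsDestination": {"outputUriPrefix": output_prefix},
        },
    }


extract = bigquery_extract_request(
    "my-project", "warehouse", "historical_events",
    "gs://my-bucket/exports/historical-*.avro",
)
job = batch_prediction_request(
    "projects/my-project/locations/us-central1/models/1234567890",
    ["gs://my-bucket/exports/historical-*.avro"],
    "gs://my-bucket/predictions/",
)
print(extract["configuration"]["extract"]["destinationFormat"])  # AVRO
```

In practice you would submit these bodies (or their SDK equivalents) with proper credentials; the point is that both steps are declarative job configurations, which is what keeps the effort minimal.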

Option A suggests importing the TensorFlow model into BigQuery ML. BigQuery ML can import TensorFlow SavedModels, but the import path has restrictions (for example, limits on model size and on supported TensorFlow operations), so it is not guaranteed to work for an arbitrary model and is not the minimal-effort choice here.

Option C (exporting to CSV) would also work, but CSV is less compact and slower to process than Avro for large datasets.


Contribute your Thoughts:

Truman
19 days ago
I prefer option C. Exporting the data to Cloud Storage in CSV format and configuring a Vertex AI batch prediction job seems like a simple solution.
upvoted 0 times
...
Lavonne
20 days ago
I don't know, options A and D both sound like they involve a lot of moving parts. Why not just go with the straightforward Cloud Storage export and Vertex AI batch prediction? Can't beat the classics!
upvoted 0 times
Deeann
9 days ago
Let's go with the classic approach of exporting to Cloud Storage and using Vertex AI batch prediction. It's reliable.
upvoted 0 times
...
Willetta
11 days ago
Yeah, Option C also involves exporting to Cloud Storage and using Vertex AI batch prediction. It's a solid choice.
upvoted 0 times
...
Melynda
12 days ago
I agree, keeping it simple with Cloud Storage export and Vertex AI batch prediction is the way to go.
upvoted 0 times
...
Tonja
14 days ago
Option B seems like the best choice. Export data to Cloud Storage and use Vertex AI batch prediction.
upvoted 0 times
...
...
Gene
22 days ago
I'm not sure about option D. I think option B could also work well if we export the historical data to Cloud Storage in Avro format.
upvoted 0 times
...
Stefania
1 month ago
Ha, BigQuery ML and TensorFlow in the same sentence? That's a recipe for a headache if I ever saw one. Option B or C for me, keep it simple!
upvoted 0 times
...
Anissa
1 month ago
Hmm, I'm not sure about option A. Trying to import the TensorFlow model into BigQuery ML seems like it might be more trouble than it's worth. I'd probably go with option C or D.
upvoted 0 times
Jesusita
12 days ago
Yeah, I think option C or D would be easier to implement for the batch inference ML pipeline.
upvoted 0 times
...
Shasta
20 days ago
I agree, option A does seem like it could be complicated.
upvoted 0 times
...
...
Maile
1 month ago
I'm leaning towards option D. Deploying a Vertex AI endpoint and using it to get predictions directly from the BigQuery data sounds like the easiest and most streamlined approach.
upvoted 0 times
Jaime
11 days ago
Let's go with option D then. Deploying a Vertex AI endpoint seems like the easiest way to get predictions.
upvoted 0 times
...
Sabine
16 days ago
It definitely sounds like the most streamlined approach. Option D is the way to go.
upvoted 0 times
...
Theron
25 days ago
I agree, deploying an endpoint and getting predictions directly from BigQuery is the way to go.
upvoted 0 times
...
Elliott
26 days ago
Option D seems like the best choice. Using a Vertex AI endpoint for predictions is efficient.
upvoted 0 times
...
...
Maybelle
1 month ago
I agree with Whitney. Option D sounds like the most straightforward approach to apply the model to the historical dataset.
upvoted 0 times
...
Simona
1 month ago
Option B seems like the most efficient choice here. Exporting the data to Cloud Storage in Avro format and then using Vertex AI batch prediction is a straightforward way to apply the TensorFlow model without having to do too much manual setup.
upvoted 0 times
Cristina
11 days ago
I agree. It's always best to choose the most efficient option when working with ML pipelines.
upvoted 0 times
...
Graham
26 days ago
Definitely, using Vertex AI batch prediction will save us a lot of time and effort.
upvoted 0 times
...
Jamika
1 month ago
That sounds like a good plan. It should make the process easier.
upvoted 0 times
...
Staci
1 month ago
B) Export the historical data to Cloud Storage in Avro format. Configure a Vertex AI batch prediction job to generate predictions for the exported data.
upvoted 0 times
...
...
Whitney
2 months ago
I think option D is the best choice. It seems like the most efficient way to get predictions from the historical data.
upvoted 0 times
...
