Google Exam Professional Machine Learning Engineer Topic 1 Question 96 Discussion

Actual exam question for Google's Professional Machine Learning Engineer exam
Question #: 96
Topic #: 1

You are implementing a batch inference ML pipeline in Google Cloud. The model was developed by using TensorFlow and is stored in SavedModel format in Cloud Storage. You need to apply the model to a historical dataset that is stored in a BigQuery table. You want to perform inference with minimal effort. What should you do?

Suggested Answer: B

Vertex AI batch prediction is the most appropriate way to apply a pre-trained TensorFlow model in SavedModel format to a large historical dataset, because it handles batch workloads without requiring you to deploy and manage an online endpoint.

The batch prediction job reads input data that you have exported from BigQuery to Cloud Storage in a supported format (such as Avro or CSV) and applies the SavedModel, which is already stored in Cloud Storage, writing the predictions back out for you.

Avro is recommended for large datasets because it is a compact, efficient binary format that is well suited to read/write operations in Google Cloud, which is why option B is correct.
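The batch prediction step in option B can be sketched with the Vertex AI Python SDK (`google-cloud-aiplatform`). Everything below is a hypothetical sketch: project, bucket, and model IDs are placeholders, and the SDK call itself is shown only in comments because it needs live GCP credentials; the helper simply assembles the keyword arguments for `Model.batch_predict`.

```python
def batch_predict_kwargs(gcs_source: str, gcs_destination: str,
                         instances_format: str = "jsonl") -> dict:
    """Assemble arguments for aiplatform.Model.batch_predict (sketch).

    instances_format: Vertex AI accepts formats such as "jsonl", "csv",
    and "tf-record" for custom-trained models; check the current docs
    for what your model type supports.
    """
    return {
        "job_display_name": "historical-batch-inference",
        "gcs_source": gcs_source,               # data exported from BigQuery
        "instances_format": instances_format,
        "gcs_destination_prefix": gcs_destination,
        "machine_type": "n1-standard-4",        # placeholder machine type
    }

# With credentials configured, the job would be launched roughly as:
#   from google.cloud import aiplatform
#   aiplatform.init(project="my-project", location="us-central1")
#   model = aiplatform.Model("MODEL_ID")        # hypothetical resource ID
#   job = model.batch_predict(**batch_predict_kwargs(
#       "gs://my-bucket/export/*", "gs://my-bucket/predictions/"))
#   job.wait()
```

Keeping the SDK invocation out of the live code path makes the parameter set easy to inspect and test before paying for a real job.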

Option A suggests using BigQuery ML for inference. BigQuery ML can in fact import a TensorFlow SavedModel via a CREATE MODEL statement and run it with ML.PREDICT, but imported models are subject to restrictions (for example, limits on model size and on the supported operations), so this answer key treats the Vertex AI batch prediction route as the more dependable choice for an arbitrary SavedModel.
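For reference, the BigQuery ML import path that option A alludes to looks roughly like the following. This is a hedged sketch with hypothetical dataset, table, and bucket names; the SQL is held in Python strings and would be executed through a client such as `google-cloud-bigquery` (shown only in comments, since it requires credentials).

```python
# Hypothetical dataset, table, and bucket names -- substitute your own.
CREATE_MODEL_SQL = """
CREATE OR REPLACE MODEL `my_dataset.imported_tf_model`
OPTIONS (MODEL_TYPE = 'TENSORFLOW',
         MODEL_PATH = 'gs://my-bucket/saved_model/*')
"""

PREDICT_SQL = """
SELECT *
FROM ML.PREDICT(MODEL `my_dataset.imported_tf_model`,
                TABLE `my_dataset.historical_data`)
"""

# With google-cloud-bigquery installed and credentials configured:
#   from google.cloud import bigquery
#   client = bigquery.Client()
#   client.query(CREATE_MODEL_SQL).result()    # imports the SavedModel
#   rows = client.query(PREDICT_SQL).result()  # batch inference in SQL
```

When the model fits within BigQuery ML's import limits, this path avoids any data export at all, which is why some readers consider option A the lowest-effort route.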

Option C (exporting to CSV) is a valid alternative but is less efficient than Avro for large datasets.


Contribute your Thoughts:

Berry
5 hours ago
I'm feeling a bit Vertexed on this one. But Option D seems like the path of least resistance. Who needs to export data when you can just let Vertex AI do its thing directly in BigQuery?
upvoted 0 times
...
Tiffiny
11 days ago
Option B all the way! Avro format is the way to go - it's like the Ferrari of data formats. Vertex AI will be purring like a kitten with that data.
upvoted 0 times
...
Roxanne
13 days ago
I prefer option C. Exporting the data to Cloud Storage in CSV format and configuring a Vertex AI batch prediction job seems like a good solution to me.
upvoted 0 times
...
Joaquin
15 days ago
I'm not sure about option D. I think option B could also work well if we export the historical data to Cloud Storage in Avro format.
upvoted 0 times
...
Erick
18 days ago
Hmm, Option A sounds like a lot of work. Why bother with importing the TensorFlow model to BigQuery ML when you can just use Vertex AI for the heavy lifting?
upvoted 0 times
...
Earlean
19 days ago
I agree with Valentine. Option D sounds like the most straightforward approach to apply the model to the historical dataset.
upvoted 0 times
...
Asha
24 days ago
Option C is the way to go! Exporting the data to CSV and then using Vertex AI for batch prediction is a simple and straightforward solution.
upvoted 0 times
Sue
8 days ago
I agree, it's a simple and straightforward solution. No need to overcomplicate things.
upvoted 0 times
...
Gwenn
13 days ago
Option C is definitely the easiest way to go. Exporting to CSV and using Vertex AI for batch prediction keeps it simple.
upvoted 0 times
...
...
Valentine
24 days ago
I think option D is the best choice. It seems like the most efficient way to get predictions from the historical data.
upvoted 0 times
...
Tegan
29 days ago
I'd go with Option D. Deploying a Vertex AI endpoint and using it to get predictions directly from BigQuery sounds like the easiest way to apply the model to the historical data.
upvoted 0 times
...
Amber
1 month ago
Option B seems the most efficient approach. Exporting the data to Cloud Storage and using Vertex AI for batch prediction is a great way to leverage the model without complicated integration.
upvoted 0 times
Cyril
18 days ago
I agree, using Vertex AI for batch prediction will make it easier to apply the model to the historical dataset.
upvoted 0 times
...
Corinne
20 days ago
I think option B is the way to go. Exporting the data to Cloud Storage and using Vertex AI for batch prediction seems efficient.
upvoted 0 times
...
...
