
Google Exam Professional Cloud Architect Topic 11 Question 84 Discussion

Actual exam question for Google's Professional Cloud Architect exam
Question #: 84
Topic #: 11

Your company wants to migrate their 10-TB on-premises database export into Cloud Storage. You want to minimize the time it takes to complete this activity, the overall cost, and the load on the database. The bandwidth between the on-premises environment and Google Cloud is 1 Gbps. You want to follow Google-recommended practices. What should you do?

Suggested Answer: A

The Data Transfer appliance is Google-provided hardware for moving large amounts of data from an on-premises environment into Cloud Storage. It is suited to scenarios where the data set is large and the bandwidth between the on-premises environment and Google Cloud is low or insufficient for a timely online transfer. By shipping the data offline, it avoids network bottlenecks and bandwidth consumption, which minimizes the time to complete the migration, the overall cost, and the load on the source database. The appliance also encrypts the data at rest and in transit, preserving security and privacy. The other options are not optimal for this scenario: they either push the full 10 TB over the 1 Gbps network connection (B, C, D) or add extra cost and complexity (B, C). Reference:

https://cloud.google.com/data-transfer-appliance/docs/overview

https://cloud.google.com/blog/products/storage-data-transfer/introducing-storage-transfer-service-for-on-premises-data
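For context, here is a back-of-the-envelope estimate of how long an online transfer of the 10-TB export would take. It assumes decimal terabytes and a fully saturated, uncontended 1 Gbps link, which is rarely achievable in practice because the link is shared with production traffic:

# Rough online-transfer estimate for the scenario in the question.
# Assumption: 10 TB = 10 * 10^12 bytes and 100% sustained link utilization.
data_bytes = 10 * 10**12          # 10 TB database export
link_bps = 1 * 10**9              # 1 Gbps on-premises-to-Google-Cloud link

transfer_seconds = data_bytes * 8 / link_bps
print(f"~{transfer_seconds / 3600:.1f} hours at full utilization")  # ~22.2 hours

Sustained throughput on a shared production link is usually well below the nominal 1 Gbps, and an online copy keeps both the network and the source database busy for the duration, which is the kind of consideration behind preferring an offline appliance here.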


Contribute your Thoughts:

Vicente
5 months ago
I'm with Cheryl on this one. Dataflow is the way to go. It's fast, efficient, and you can't beat Google's recommended practices. Although, I do wonder if they'll give us a free Data Transfer appliance as a reward for choosing the 'right' answer.
upvoted 0 times
Wilbert
5 months ago
Haha, Claribel, Dataflow may be a bit more complex, but it's probably worth it for a 10-TB migration. Plus, who doesn't love some good old-fashioned 'cloud magic'?
upvoted 0 times
Kerrie
4 months ago
C) Develop a Dataflow job to read data directly from the database and write it into Cloud Storage
upvoted 0 times
Bettye
5 months ago
A) Use the Data Transfer appliance to perform an offline migration
upvoted 0 times
Carey
5 months ago
C) Develop a Dataflow job to read data directly from the database and write it into Cloud Storage
upvoted 0 times
In
5 months ago
A) Use the Data Transfer appliance to perform an offline migration
upvoted 0 times
Claribel
6 months ago
Hmm, I'm not sure about Dataflow. Isn't that thing a bit complicated? I'm leaning more towards Option D - compressing the data and using gsutil seems straightforward and should get the job done.
upvoted 0 times
Glory
5 months ago
Yeah, compressing the data and using gsutil sounds like a good plan.
upvoted 0 times
Adelina
5 months ago
I agree, Option D seems more straightforward.
upvoted 0 times
Whitley
5 months ago
I think Dataflow might be a bit complicated.
upvoted 0 times
Cheryl
6 months ago
I agree, Dataflow seems like the way to go. It's a Google-recommended practice and will probably be the most cost-effective solution for a 10-TB migration.
upvoted 0 times
Lauryn
5 months ago
I agree, Dataflow seems like the way to go. It's a Google-recommended practice and will probably be the most cost-effective solution for a 10-TB migration.
upvoted 0 times
Gearldine
5 months ago
C) Develop a Dataflow job to read data directly from the database and write it into Cloud Storage
upvoted 0 times
Reita
6 months ago
A) Use the Data Transfer appliance to perform an offline migration
upvoted 0 times
Ernestine
6 months ago
I agree with Chantell, it will help minimize the time and cost.
upvoted 0 times
Minna
6 months ago
Option C looks like the best choice here. A Dataflow job can directly read from the database and write to Cloud Storage, which should be faster and more efficient than the other options.
upvoted 0 times
Aliza
5 months ago
Yes, using a Dataflow job to read data directly from the database and write it into Cloud Storage should help minimize the time and cost of the migration.
upvoted 0 times
Luz
5 months ago
I agree, Option C seems like the most efficient choice for this scenario.
upvoted 0 times
Lachelle
5 months ago
Yes, using a Dataflow job to directly read and write data to Cloud Storage would definitely help minimize the time and cost of the migration.
upvoted 0 times
Ressie
6 months ago
I agree, Option C seems like the most efficient choice for this scenario.
upvoted 0 times
Glynda
6 months ago
Yes, using a Dataflow job to read data directly from the database and write it into Cloud Storage would definitely help minimize the time and cost.
upvoted 0 times
Jamal
6 months ago
I agree, Option C seems like the most efficient choice for this migration.
upvoted 0 times
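Several commenters in this thread pick option C, a Dataflow job that reads directly from the database and writes to Cloud Storage. As a rough, hypothetical sketch only (the project, region, bucket, table, and connection details below are placeholders, and ReadFromJdbc relies on Beam's cross-language JDBC support), such a pipeline in the Apache Beam Python SDK might look like this:

import apache_beam as beam
from apache_beam.io import WriteToText
from apache_beam.io.jdbc import ReadFromJdbc
from apache_beam.options.pipeline_options import PipelineOptions

# All identifiers below are illustrative placeholders, not values from the question.
options = PipelineOptions(
    runner="DataflowRunner",
    project="example-project",
    region="us-central1",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        # Read rows from the source database over JDBC (placeholder connection).
        | "ReadFromDatabase" >> ReadFromJdbc(
            table_name="export_table",
            driver_class_name="org.postgresql.Driver",
            jdbc_url="jdbc:postgresql://db-host:5432/prod",
            username="reader",
            password="change-me",
        )
        # Serialize each row as a CSV line before writing.
        | "RowsToCsv" >> beam.Map(lambda row: ",".join(str(field) for field in row))
        # Write sharded text files to a Cloud Storage bucket.
        | "WriteToCloudStorage" >> WriteToText("gs://example-bucket/export/part")
    )

Note the trade-off the suggested answer calls out: this keeps the source database and the 1 Gbps link busy for the entire copy, whereas the offline appliance avoids both.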
Chantell
6 months ago
I think we should use the Data Transfer appliance for offline migration.
upvoted 0 times
