
Microsoft Exam DP-203 Topic 8 Question 78 Discussion

Actual exam question for Microsoft's DP-203 exam
Question #: 78
Topic #: 8

You have two Azure Blob Storage accounts named account1 and account2.

You plan to create an Azure Data Factory pipeline that will run at scheduled intervals and replicate newly created or modified blobs from account1 to account2.

You need to recommend a solution to implement the pipeline. The solution must meet the following requirements:

* Ensure that the pipeline copies only blobs that were created or modified since the most recent replication event.

* Minimize the effort to create the pipeline.

What should you recommend?

Suggested Answer: A
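For context on the first requirement, the pattern behind "copy only blobs created or modified since the most recent replication event" is a watermark: the pipeline records the timestamp of its last run and, on the next run, selects only blobs whose last-modified time is newer. The Data Factory copy tasks discussed below implement this internally; the sketch here is a simplified, self-contained illustration of the idea, using hypothetical blob names and dates rather than real Azure SDK calls.

```python
from datetime import datetime, timezone

# Hypothetical blob listing: (name, last_modified) pairs standing in for
# the per-blob metadata that Azure Blob Storage returns when listing a container.
blobs = [
    ("logs/2024-01-01.json", datetime(2024, 1, 1, tzinfo=timezone.utc)),
    ("logs/2024-01-15.json", datetime(2024, 1, 15, tzinfo=timezone.utc)),
    ("logs/2024-02-01.json", datetime(2024, 2, 1, tzinfo=timezone.utc)),
]

def blobs_to_copy(blobs, watermark):
    """Return only the blobs created or modified after the last replication run."""
    return [name for name, modified in blobs if modified > watermark]

# Watermark = timestamp of the most recent replication event.
watermark = datetime(2024, 1, 10, tzinfo=timezone.utc)
print(blobs_to_copy(blobs, watermark))
# -> ['logs/2024-01-15.json', 'logs/2024-02-01.json']
```

After each run, the pipeline would advance the watermark to the current run's start time, so the next scheduled interval picks up only changes made since then.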

Contribute your Thoughts:

Brice
8 months ago
Hmm, I don't know. A Data Flow activity seems like overkill for this scenario. I'd be worried that it would be more complex to set up than necessary.
upvoted 0 times
...
Sherman
8 months ago
Hold on, I'm not so sure about that. Option D, the Built-in copy task, might be a simpler solution. It doesn't have the same level of automation, but it could still get the job done with less effort.
upvoted 0 times
...
Jerrod
8 months ago
I think option C is the way to go. The Metadata-driven copy task in the Copy Data tool should automatically detect new and modified blobs, which meets the first requirement. And it's probably the easiest to set up, which addresses the second requirement.
upvoted 0 times
...
Salina
8 months ago
This question seems straightforward, but it's important to understand the specific requirements. We need a solution that only copies new or modified blobs, and minimizes the effort to create the pipeline.
upvoted 0 times
...
