
Microsoft Exam DP-700 Topic 3 Question 8 Discussion

Actual exam question for Microsoft's DP-700 exam
Question #: 8
Topic #: 3

You need to schedule the population of the medallion layers to meet the technical requirements.

What should you do?

Suggested Answer: A

Why Use a Data Pipeline That Calls Other Data Pipelines?

The technical requirements specify:

- Sequential execution of the child pipelines, so each medallion layer is populated in order.

- Error handling that sends email notifications upon failures.

- Parallel execution of tasks where possible (e.g., simultaneous imports into the bronze layer).

A parent pipeline that invokes child pipelines meets all three: it can chain the child runs sequentially, fan out independent imports in parallel, and trigger an email notification on any failure.
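The pattern above (a parent pipeline that fans out parallel bronze imports, then runs silver and gold sequentially, and notifies on failure) can be sketched in plain Python. This is an illustration of the orchestration logic only, not Fabric's API; the pipeline names and the `notify_on_failure` helper are made up for the example:

```python
from concurrent.futures import ThreadPoolExecutor

def run_child_pipeline(name: str) -> str:
    # Stand-in for invoking a child data pipeline (in Fabric this would
    # be a pipeline activity inside the parent pipeline).
    return f"{name}: succeeded"

def notify_on_failure(error: Exception) -> None:
    # Stand-in for an email notification wired to the parent
    # pipeline's on-failure path.
    print(f"ALERT: pipeline run failed: {error}")

def run_parent_pipeline() -> list[str]:
    results: list[str] = []
    try:
        # Bronze imports have no dependencies on each other,
        # so they can run in parallel.
        with ThreadPoolExecutor() as pool:
            results += list(pool.map(
                run_child_pipeline,
                ["Ingest_Sales_Bronze", "Ingest_Products_Bronze"]))
        # Silver and gold depend on the layer below,
        # so they run sequentially.
        results.append(run_child_pipeline("Transform_Silver"))
        results.append(run_child_pipeline("Aggregate_Gold"))
    except Exception as exc:
        notify_on_failure(exc)
        raise
    return results
```

The key design point the question is testing: a single parent pipeline gives one place to express the ordering, the parallelism, and the failure handling, rather than scheduling each child independently.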


Contribute your Thoughts:

Amie
1 day ago
You know, I was just thinking about how great it would be if we could schedule a data pipeline that could also perform stand-up comedy. That would really liven up the technical requirements.
upvoted 0 times
Stephanie
2 days ago
Hold up, guys. What if we combine options A and D? Scheduling a data pipeline that calls other data pipelines could be the perfect way to orchestrate the whole process.
upvoted 0 times
Armanda
1 month ago
Hmm, I'm not sure about that. Scheduling a notebook seems a bit too simple for this task. I'd go with option C and schedule an Apache Spark job instead.
upvoted 0 times
Clement
6 days ago
I agree, I would go with scheduling an Apache Spark job.
upvoted 0 times
Moira
12 days ago
I think scheduling a notebook might not be enough for this task.
upvoted 0 times
Gerald
1 month ago
But wouldn't scheduling multiple data pipelines provide more flexibility and scalability?
upvoted 0 times
Stephania
1 month ago
I disagree, I believe scheduling an Apache Spark job would be more efficient.
upvoted 0 times
Davida
1 month ago
I think option D is the way to go. Scheduling multiple data pipelines seems like the most comprehensive solution to meet the technical requirements.
upvoted 0 times
Janessa
16 days ago
I agree, scheduling multiple data pipelines is the most efficient way to meet the technical requirements.
upvoted 0 times
Vivan
17 days ago
Option D is definitely the best choice. It covers all bases.
upvoted 0 times
Gerald
1 month ago
I think we should schedule a data pipeline that calls other data pipelines.
upvoted 0 times
