Cloudera Exam CCA175 Topic 1 Question 61 Discussion

Actual exam question for Cloudera's CCA175 exam
Question #: 61
Topic #: 1

Problem Scenario 94: You have to run your Spark application on YARN with 20GB of memory per executor and 50 executors. Please replace XXX, YYY, and ZZZ in the command below.

export HADOOP_CONF_DIR=XXX

./bin/spark-submit \
  --class com.hadoopexam.MyTask \
  XXX \
  --deploy-mode cluster \  # can be client for client mode
  YYY \
  ZZZ \
  /path/to/hadoopexam.jar \
  1000

Suggested Answer: A
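
The option text for A is not reproduced on this page, but going by the problem statement and the flags quoted in the discussion below, the completed submission would presumably look like the sketch below. The HADOOP_CONF_DIR value shown is only an assumed example; substitute your cluster's actual configuration directory.

export HADOOP_CONF_DIR=/etc/hadoop/conf   # assumed path; use your cluster's Hadoop config directory
./bin/spark-submit \
  --class com.hadoopexam.MyTask \
  --master yarn \
  --deploy-mode cluster \
  --executor-memory 20G \
  --num-executors 50 \
  /path/to/hadoopexam.jar \
  1000

In other words, the placeholder after --class resolves to --master yarn, YYY to --executor-memory 20G, and ZZZ to --num-executors 50. The inline "# can be client for client mode" comment from the scaffold is dropped here because a comment after a trailing backslash would break the line continuation in a real shell.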

Contribute your Thoughts:

Elly
4 months ago
I'd go with option A. Trying to outsmart the question by going for more memory and executors is a classic trap. Keep it simple, people!
upvoted 0 times
Teri
3 months ago
Option A it is then, thanks for the advice!
upvoted 0 times
Charolette
3 months ago
Agreed, keeping it simple is the way to go.
upvoted 0 times
Cecil
4 months ago
I'd go with option A.
upvoted 0 times
Josephine
4 months ago
Haha, this reminds me of the time I accidentally set the executor memory to 40GB and wondered why my cluster kept crashing. Option A for the win!
upvoted 0 times
Yoko
3 months ago
Definitely, Option A with 20GB executor memory is the correct choice.
upvoted 0 times
Leonida
3 months ago
Yeah, it's important to set it right. Option A is the way to go.
upvoted 0 times
Erinn
4 months ago
Haha, I did the same thing once with 40GB executor memory.
upvoted 0 times
Marget
4 months ago
Hmm, I'm not sure, but I think option A is the way to go. Doesn't hurt to double-check the requirements though, just to be safe.
upvoted 0 times
Audrie
5 months ago
Option A looks good to me. The settings seem to match the requirements in the problem statement.
upvoted 0 times
Kirby
3 months ago
--num-executors 50
upvoted 0 times
Elvis
4 months ago
I agree, Option A is the correct solution for running the Spark application on yarn with 20GB per executor and 50 executors.
upvoted 0 times
Daren
4 months ago
--executor-memory 20G
upvoted 0 times
Lizette
4 months ago
Option A looks good to me. The settings seem to match the requirements in the problem statement.
upvoted 0 times
Kristine
4 months ago
The settings seem to match the requirements in the problem statement.
upvoted 0 times
Tequila
4 months ago
--master yarn
upvoted 0 times
Nana
4 months ago
Option A looks good to me.
upvoted 0 times
Stephanie
4 months ago
Option A looks good to me. The settings seem to match the requirements in the problem statement.
upvoted 0 times
Kyoko
5 months ago
I think option A is the correct solution. The problem statement clearly asks for 20GB of memory per executor and 50 executors, which matches option A.
upvoted 0 times
Mohammad
5 months ago
Yes, option A matches the requirements given in the problem scenario.
upvoted 0 times
Denae
5 months ago
I agree, option A is the correct solution.
upvoted 0 times
