The code block shown below should set the number of partitions that Spark uses when shuffling data for joins or aggregations to 100. Choose the answer that correctly fills the blanks in the code block to accomplish this.
spark.sql.shuffle.partitions
__1__.__2__.__3__(__4__, 100)
Correct code block:
spark.conf.set('spark.sql.shuffle.partitions', 100)
The code block expresses the option incorrectly.
Correct! The option should be expressed as a string.
The code block sets the wrong option.
No, spark.sql.shuffle.partitions is the correct option for the use case in the question.
The code block sets the incorrect number of partitions.
No, the code block correctly specifies 100 partitions, as required by the question.
The code block uses the wrong command for setting an option.
No, in PySpark spark.conf.set() is the correct command for setting an option.
The code block is missing a parameter.
Incorrect, spark.conf.set() takes exactly two parameters (the option name and its value), and both are provided here.
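For readers who want to see the corrected call end to end, the following is a minimal PySpark sketch (the application name, DataFrame names, and printed checks are illustrative assumptions, not part of the question) that sets the option, reads it back, and triggers a shuffle:

from pyspark.sql import SparkSession

# Illustrative application name; any existing SparkSession works the same way.
spark = SparkSession.builder.appName('shuffle-partitions-demo').getOrCreate()

# The option name is passed as a string; the value is the number of
# partitions Spark uses when shuffling data for joins or aggregations.
spark.conf.set('spark.sql.shuffle.partitions', 100)

# Reading the option back returns its current value as a string.
print(spark.conf.get('spark.sql.shuffle.partitions'))  # '100'

# Any wide transformation after this point shuffles into up to 100
# partitions; with adaptive query execution enabled, Spark may coalesce
# them to fewer partitions at runtime.
df = spark.range(1000)
counts = df.groupBy((df.id % 10).alias('bucket')).count()
print(counts.rdd.getNumPartitions())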
More info: Configuration - Spark 3.1.2 Documentation