The code block displayed below contains an error. The code block should configure Spark to split data into 20 parts when exchanging data between executors for joins or aggregations. Find the error.
Code block:
spark.conf.set(spark.sql.shuffle.partitions, 20)
Correct code block:
spark.conf.set('spark.sql.shuffle.partitions', 20)
The code block expresses the option incorrectly.
Correct! The option should be expressed as a string.
The code block sets the wrong option.
No, spark.sql.shuffle.partitions is the correct option for the use case in the question.
The code block sets the incorrect number of parts.
Wrong, the code block correctly states 20 parts.
The code block uses the wrong command for setting an option.
No, in PySpark spark.conf.set() is the correct command for setting an option.
The code block is missing a parameter.
Incorrect, spark.conf.set() takes two parameters (the option key and its value), and both are present in the code block.
More info: Configuration - Spark 3.1.2 Documentation
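Below is a minimal sketch showing the corrected call in context. The session name, sample data, and column names are hypothetical; the point is that the option key must be a string and that the setting then controls how many partitions wide transformations (joins, aggregations) shuffle into.

from pyspark.sql import SparkSession

# Hypothetical session for illustration only.
spark = SparkSession.builder.appName("shuffle-partitions-demo").getOrCreate()

# The option key is passed as a string; 20 is the number of shuffle partitions.
spark.conf.set('spark.sql.shuffle.partitions', 20)

# Small sample DataFrame with made-up data.
df = spark.createDataFrame(
    [("a", 1), ("b", 2), ("a", 3)],
    ["key", "value"],
)

# The groupBy aggregation triggers a shuffle, which now uses 20 partitions
# (adaptive query execution, if enabled, may coalesce some of them).
aggregated = df.groupBy("key").sum("value")
print(aggregated.rdd.getNumPartitions())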