The code block displayed below contains an error. The code block should configure Spark to split data into 20 parts when exchanging data between executors for joins or aggregations. Find the error.
Code block:
spark.conf.set(spark.sql.shuffle.partitions, 20)
Correct code block:
spark.conf.set('spark.sql.shuffle.partitions', 20)
The code block expresses the option incorrectly.
Correct! The option should be expressed as a string.
The code block sets the wrong option.
No, spark.sql.shuffle.partitions is the correct option for the use case in the question.
The code block sets the incorrect number of parts.
Wrong, the code block correctly states 20 parts.
The code block uses the wrong command for setting an option.
No, in PySpark spark.conf.set() is the correct command for setting an option.
The code block is missing a parameter.
Incorrect, spark.conf.set() takes two parameters, the option key and its value, and both are present in the code block.
More info: Configuration - Spark 3.1.2 Documentation
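For reference, a minimal PySpark sketch showing the corrected call in context. The application name and sample data below are illustrative assumptions, not part of the question.

# Minimal sketch, assuming a local SparkSession; 'shuffle-demo' and the
# sample data are arbitrary examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName('shuffle-demo').getOrCreate()

# The option key must be passed as a string.
spark.conf.set('spark.sql.shuffle.partitions', 20)

# Confirm the setting took effect.
print(spark.conf.get('spark.sql.shuffle.partitions'))  # '20'

# A wide transformation such as groupBy() now shuffles into 20 partitions
# (assuming adaptive query execution does not coalesce them; AQE is off by
# default in Spark 3.1).
df = spark.range(1000).withColumn('bucket', F.col('id') % 10)
print(df.groupBy('bucket').count().rdd.getNumPartitions())  # 20

Note that spark.sql.shuffle.partitions only controls the number of partitions produced by shuffles for joins and aggregations; it does not change the partitioning of data read from source.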