The code block displayed below contains an error. The code block should configure Spark to split data into 20 parts when exchanging data between executors for joins or aggregations. Find the error.
Code block:
spark.conf.set(spark.sql.shuffle.partitions, 20)
Correct code block:
spark.conf.set('spark.sql.shuffle.partitions', 20)
The code block expresses the option incorrectly.
Correct! The option name must be passed as a string.
The code block sets the wrong option.
No, spark.sql.shuffle.partitions is the correct option for the use case in the question.
The code block sets the incorrect number of parts.
Wrong, the code block correctly states 20 parts.
The code block uses the wrong command for setting an option.
No, in PySpark, spark.conf.set() is the correct command for setting an option.
The code block is missing a parameter.
Incorrect, spark.conf.set() takes two parameters.
More info: Configuration - Spark 3.1.2 Documentation