Problem Scenario 70 : Write a Spark application in Python that reads a file "Content.txt" (on HDFS) with the following content, performs a word count, and saves the results in a directory called "problem85" (on HDFS).
Content.txt
Apache Spark Training
This is Spark Learning Session
Spark is faster than MapReduce
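A minimal PySpark sketch of one possible solution is given below. The input path "Content.txt" and output directory "problem85" come from the problem statement; the script name (wordcount.py) and application name are illustrative, and the HDFS paths may need to be prefixed with your cluster's namenode URI or user directory.

# wordcount.py -- minimal word-count sketch, assuming "Content.txt" is
# readable from HDFS at the default path and "problem85" does not yet exist.
from pyspark import SparkContext

sc = SparkContext(appName="WordCount")

# Read the input file from HDFS; each element of the RDD is one line of text.
lines = sc.textFile("Content.txt")

# Split lines into words, pair each word with 1, then sum counts per word.
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

# Write the (word, count) pairs to the output directory on HDFS.
counts.saveAsTextFile("problem85")

sc.stop()

The job can then be submitted with spark-submit wordcount.py, after which the results appear as part-* files inside the "problem85" directory on HDFS.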