Which tells Spark how and where to access the cluster?
Answers
The SparkContext tells Spark how and where to access the cluster; creating one is the first thing a Spark program does. At a high level, every Spark application consists of a driver program that runs the user's main function and executes various parallel operations on a cluster. The main abstraction Spark provides is the resilient distributed dataset (RDD), a collection of elements partitioned across the nodes of the cluster that can be operated on in parallel. RDDs are created by starting with a file in the Hadoop file system or an existing Scala collection in the driver program, and transforming it.
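To make this concrete, here is a minimal Scala sketch. The master URL "local[*]" (run locally on all cores), the app name, and the file path are illustrative assumptions; on a real cluster the master URL would point at the cluster manager instead:

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkContextExample {
      def main(args: Array[String]): Unit = {
        // The master URL in the SparkConf is what tells Spark how and where
        // to access the cluster. "local[*]" is an assumption for illustration;
        // e.g. "spark://host:7077" would target a standalone cluster instead.
        val conf = new SparkConf()
          .setAppName("rdd-sketch")
          .setMaster("local[*]")
        val sc = new SparkContext(conf)

        // Create an RDD from an existing Scala collection in the driver program...
        val fromCollection = sc.parallelize(1 to 100)

        // ...or from a file (hypothetical path; could be an HDFS URI):
        // val fromFile = sc.textFile("hdfs:///data/input.txt")

        // Transform and act on the partitioned elements in parallel.
        val sumOfSquares = fromCollection.map(n => n * n).reduce(_ + _)
        println(s"Sum of squares: $sumOfSquares")

        sc.stop()
      }
    }

The driver builds the RDD, and Spark distributes the map and reduce work across the partitions.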