Answer Posted / Arjun Tyagi
The Driver program in Apache Spark is the process that runs the application's main() function. It initializes the SparkContext, reads input data, creates RDDs (Resilient Distributed Datasets), and records the transformations and actions to be performed on them as an execution graph (a DAG). At runtime it requests Executor processes from the cluster manager, schedules tasks on those Executors, and tracks their progress and results.