Answer Posted / Avni Gupta
The architecture of Apache Spark is built around the following components: Resilient Distributed Datasets (RDDs), the Directed Acyclic Graph (DAG), and the Spark Driver Program. An RDD is Spark's basic building block: an immutable, fault-tolerant collection of objects partitioned across the nodes of the cluster. The DAG is a graph of the operations applied to the data; Spark's scheduler uses it to break a job into stages and tasks and to decide their execution order. The Driver Program is the entry point of a Spark application: it creates the SparkContext, builds up the DAG from the transformations you apply, and submits jobs to the cluster for execution.