Answer Posted / Raghuveer Singh
Spark Core is the foundation of Apache Spark, providing the essential building blocks for distributed computing. It includes: (1) RDD (Resilient Distributed Dataset), an immutable, partitioned collection of records that can be processed in parallel and recomputed on failure via its lineage. (2) Scheduler, which turns a job into a DAG of stages and tasks and assigns those tasks to executors. (3) Executors, JVM processes on the worker nodes that run tasks and cache data. (4) Cluster Manager (e.g., Standalone, YARN, or Kubernetes), which allocates cluster resources to the driver program so it can launch executor processes.