Answer Posted / Dhirendra Kumar Rathore
A Spark Standalone cluster is a deployment mode for Apache Spark that uses Spark's own built-in cluster manager, with no external resource manager such as YARN, Mesos, or Kubernetes. It consists of one master node (the cluster manager) and one or more worker nodes. The master allocates executor resources on the workers for each submitted application and monitors worker health; scheduling of individual tasks within an application is then handled by that application's driver.
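As a minimal sketch, a standalone cluster can be brought up with the launch scripts shipped in Spark's `sbin/` directory, and an application submitted against the master's `spark://` URL. The hostname `master-host` and the installation path are assumptions for illustration; the default master port is 7077 and the web UI is on 8080.

```shell
# On the master node: start the standalone master
# (prints the master URL, e.g. spark://master-host:7077)
$SPARK_HOME/sbin/start-master.sh

# On each worker node: start a worker and register it with the master
$SPARK_HOME/sbin/start-worker.sh spark://master-host:7077

# From a client machine: submit an application to the cluster,
# capping the total cores it may claim across all workers
$SPARK_HOME/bin/spark-submit \
  --master spark://master-host:7077 \
  --total-executor-cores 4 \
  $SPARK_HOME/examples/src/main/python/pi.py 100
```

The master's web UI at `http://master-host:8080` shows registered workers and running applications, which is the usual way to confirm the cluster formed correctly.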