Answer Posted / Dhiraj Singh
1. Local mode: Spark runs entirely on a single machine in one JVM; the driver and executor logic share that process, so this mode is mainly used for development and testing.
2. Cluster mode: Spark runs on a cluster managed by an external resource manager such as YARN or Mesos; executors are launched on the worker nodes, and the driver itself runs inside the cluster.
3. Standalone mode: Spark uses its own built-in resource manager and job scheduler instead of relying on YARN or Mesos.
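As a sketch, these modes are typically selected through the `--master` (and, for cluster mode, `--deploy-mode`) flags of `spark-submit`; the host/port and the application file `my_app.py` below are placeholders, not real values:

```shell
# Local mode: driver and executors run in a single JVM on this machine;
# "local[*]" uses as many worker threads as there are CPU cores
spark-submit --master "local[*]" my_app.py

# Cluster mode on YARN: YARN launches executors on the worker nodes,
# and --deploy-mode cluster runs the driver inside the cluster too
spark-submit --master yarn --deploy-mode cluster my_app.py

# Standalone mode: connect to Spark's own built-in master/scheduler
# (spark://master-host:7077 is a placeholder URL)
spark-submit --master spark://master-host:7077 my_app.py
```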