Do you need to install spark on all nodes of yarn cluster?
Answer Posted / Desh Deepak
No, when running a Spark application on a YARN cluster it is not necessary to install Spark on every node. Spark only needs to be installed on the client (gateway) machine from which the job is submitted. When you run spark-submit with --master yarn, the Spark jars are shipped to the cluster (either uploaded automatically or taken from an archive configured via spark.yarn.archive / spark.yarn.jars, for example on HDFS), and YARN's NodeManagers launch the driver and executors inside containers using those distributed files. The worker nodes themselves need only a compatible JVM and the YARN NodeManager service, and they remain free to run other YARN applications alongside Spark.
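A minimal sketch of such a submission, assuming a working Hadoop/YARN client configuration and a hypothetical application jar and HDFS path (the paths and app name here are illustrative, not from the original answer):

```shell
# Submit from the gateway node only; no Spark install is needed on the workers.
# Optionally pre-stage the Spark jars on HDFS so YARN containers can fetch them
# instead of uploading them on every submission.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.archive=hdfs:///apps/spark/spark-libs.zip \
  --class com.example.MyApp \
  /path/to/my-app.jar
```

Setting spark.yarn.archive (or spark.yarn.jars) is an optimization: without it, spark-submit uploads the local Spark libraries to the cluster's distributed cache each time, which still works but is slower.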