Answer Posted / Rohit Sah
In a Spark cluster setup, it is not always necessary to install Apache Spark on every node — it depends on the cluster manager. When Spark runs on YARN or Mesos, the cluster manager distributes the Spark runtime to the nodes that need it, so a local installation on each worker is not required. In Spark's standalone mode, however, the Spark binaries must be present on all nodes. Architecturally, a master node runs the cluster manager, the driver program coordinates the job, and executors run tasks on the worker (historically "slave") nodes. The number of workers and executors can be scaled up or down as needed.
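As a minimal sketch of the idea above, a job can be submitted to a YARN cluster without Spark being pre-installed on every worker; the application name and file paths here are hypothetical examples, not fixed values:

```shell
# Submit a Spark application to YARN; YARN ships the Spark runtime
# (and the application jar/py files) to whichever nodes run executors.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 4g \
  my_job.py
```

The `--num-executors`, `--executor-cores`, and `--executor-memory` flags illustrate how the worker-side capacity can be dialed up or down per job without touching the cluster's node count.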