To use Spark on an existing Hadoop Cluster, do we need to install Spark on all nodes of Hadoop?
Answer Posted / Nikhil Verma
No, you don't have to install Spark on every node of the Hadoop cluster. When Spark runs on YARN (Hadoop's resource manager), it is enough to install Spark on the node from which you submit applications (often called an edge or gateway node). At submission time, YARN distributes the Spark runtime and application jars to the containers it allocates, so the worker nodes don't need a local Spark installation. You do, however, need to make sure the client can find the cluster configuration (for example, `HADOOP_CONF_DIR` pointing at the Hadoop config files) and that all application dependencies are available and properly configured.
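As a minimal sketch of how this works in practice: assuming Spark is installed only on the submitting node and `HADOOP_CONF_DIR` points at the cluster's configuration directory, a job can be sent to YARN like this (the paths and the application jar name below are illustrative, not from the original answer):

```shell
# Tell the Spark client where the Hadoop/YARN configuration lives
# (path is an example; use your cluster's actual config directory).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Submit to YARN in cluster mode: YARN allocates containers on the
# worker nodes and ships the Spark runtime and your jar to them,
# so no Spark install is needed on those nodes.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  --class com.example.MyApp \
  my-app.jar
```

With `--deploy-mode cluster` the driver itself also runs inside a YARN container; with `--deploy-mode client` the driver stays on the submitting node while executors still run in YARN containers.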