Do you need to install Spark on all nodes of a YARN cluster while running Spark on YARN?
Answer Posted / Naveen Kumar Sonia
No. When running Spark on YARN (Yet Another Resource Negotiator), you only need to install Spark on the client (gateway) nodes from which you submit applications. YARN ships the Spark runtime to the worker nodes for each application, so the workers do not need a local Spark installation; the resource manager (YARN) and the Hadoop Distributed File System (HDFS) are already available in the cluster.
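As a minimal sketch of what this looks like in practice (assuming a working Hadoop client configuration in `HADOOP_CONF_DIR` and a hypothetical application jar path), you submit from the client node with `--master yarn`, and optionally point `spark.yarn.archive` at a pre-staged copy of the Spark jars on HDFS so YARN can distribute the runtime to the workers:

```shell
# Run from a client/gateway node that has Spark installed.
# HADOOP_CONF_DIR must point at the cluster's Hadoop config
# so spark-submit can find the YARN ResourceManager and HDFS.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Optional but recommended: stage the Spark jars on HDFS once,
# so each submission doesn't re-upload them from the client.
# (The HDFS path below is an example, not a required location.)
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.archive=hdfs:///apps/spark/spark-libs.jar \
  --class com.example.MyApp \
  /path/to/my-app.jar
```

With `--deploy-mode cluster`, even the driver runs inside a YARN container on the cluster, so nothing beyond the submission itself happens on the client node.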