Is it necessary to install Spark on all nodes when running a Spark application on YARN?
Answer Posted / Amit Chauhan
No, it is not necessary to install Spark on all nodes when running a Spark application on YARN. Spark runs on top of YARN, so only the client (gateway) node that submits the job needs a Spark installation. The worker nodes (the machines running YARN NodeManagers, typically also the HDFS DataNodes) supply CPU and memory through YARN containers, and YARN ships the Spark runtime (the Spark JARs) to those containers when the application starts.
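For example, a job can be submitted in YARN mode from the gateway node alone. This is a minimal sketch; the application JAR name and the HDFS path for the Spark archive are assumptions for illustration, and pre-staging the archive via `spark.yarn.archive` is optional (without it, Spark uploads its local JARs each time).

```shell
# Submit from the one node that has Spark installed; YARN distributes
# the Spark runtime to the worker containers automatically.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.archive=hdfs:///spark/spark-libs.zip \
  --class com.example.MyApp \
  myapp.jar
```

Pointing `spark.yarn.archive` at an archive already stored in HDFS avoids re-uploading the Spark JARs on every submission, which is a common optimization rather than a requirement.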