Answer Posted / Prem Kumar Sharma
In Apache Spark, a Worker Node is any machine in the cluster that runs application code, as opposed to the master node, which manages the cluster. Each Worker Node hosts one or more executors, the processes that actually run tasks and cache data, so a job's work is spread across executors on many workers and processed in parallel.
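To make the worker/executor relationship concrete, here is a minimal `spark-submit` sketch (the application name `app.py` and the resource numbers are illustrative assumptions, not fixed values):

```shell
# Illustrative only: submit a job so the cluster manager places
# 4 executors on the available worker nodes; each executor gets
# 2 CPU cores and 4 GB of memory for running tasks in parallel.
spark-submit \
  --master spark://master-host:7077 \
  --num-executors 4 \
  --executor-cores 2 \
  --executor-memory 4g \
  app.py
```

With this configuration the cluster manager distributes the 4 executors across the worker nodes, and each executor can run up to 2 tasks at a time.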