Answer Posted / Piyush Chaudhary
A Spark job consists of one or more stages, with stage boundaries created wherever a shuffle is required. Each stage contains one or more tasks, one per partition of the data. In other words, a stage divides the work into smaller chunks (tasks), which are then executed in parallel on different worker nodes.
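As a minimal sketch of this, the snippet below (a hypothetical word-count example, assuming a local Spark installation) builds an RDD with 4 partitions and triggers a shuffle with reduceByKey, so collect() submits a job that Spark splits into two stages of 4 tasks each:

import org.apache.spark.sql.SparkSession

object StagesDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("StagesDemo")
      .master("local[*]")   // run locally, using all available cores
      .getOrCreate()
    val sc = spark.sparkContext

    // An RDD with 4 partitions: each partition becomes one task per stage.
    val words = sc.parallelize(Seq("a", "b", "a", "c", "b", "a"), numSlices = 4)

    // Stage 1: the map runs as 4 tasks (one per partition).
    // reduceByKey needs a shuffle, so Spark cuts a stage boundary here.
    val counts = words.map(w => (w, 1)).reduceByKey(_ + _)

    // Stage 2: the post-shuffle reduce tasks. Calling collect() is the
    // action that submits the job, which the scheduler splits into the
    // two stages above and hands out as tasks to the worker nodes.
    counts.collect().foreach(println)

    spark.stop()
  }
}

Running this and opening the Spark UI shows one job with two stages, which makes the job -> stage -> task hierarchy described above visible directly.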