Answer Posted / Rinku Kumar
Spark stages are the physical units of execution within a job. When an action is triggered, Apache Spark builds a DAG for the job and splits it into stages at shuffle boundaries: wide transformations such as reduceByKey or join end one stage and start the next. There are only two kinds of stages: ShuffleMapStage, an intermediate stage that writes shuffle output for a downstream stage, and ResultStage, the final stage that computes the action's result. Each stage is further divided into tasks, one per partition, which run in parallel on worker nodes.