Answer Posted / Parvez Khan
When you submit a Spark job, the driver breaks it into stages and then into tasks, with one task per partition of the data. The tasks run in parallel on executors across the cluster; each task operates on its own partition and produces an intermediate result, and those intermediate results are combined (for example, in a reduce or shuffle step) to produce the final result.
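A minimal sketch of that pattern in plain Python (no Spark required, so the names `partition` and `task` here are illustrative, not Spark APIs): the data is split into partitions, each task independently computes an intermediate result for its partition, and the partials are combined into the final answer.

```python
from concurrent.futures import ThreadPoolExecutor

def partition(data, n):
    """Split data into n roughly equal chunks (Spark calls these partitions)."""
    size = (len(data) + n - 1) // n
    return [data[i:i + size] for i in range(0, len(data), size)]

def task(chunk):
    """Each task produces an intermediate result for its own partition."""
    return sum(x * x for x in chunk)

data = list(range(1, 101))

# Run one task per partition in parallel, like executors in a cluster.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(task, partition(data, 4)))

# Combine the intermediate results into the final result (the "reduce" step).
total = sum(partials)
print(total)  # 338350
```

In real Spark the equivalent would be something like `sc.parallelize(data, 4).map(lambda x: x * x).sum()`, where Spark handles the partitioning, task scheduling, and combining for you.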