Answer Posted / Vinay Kumar Sharma
An accumulator in Spark is a shared variable that tasks can only add to, using an associative and commutative operation (e.g. a sum or a counter). Each task updates its own local copy, and Spark merges those updates into the accumulator on the driver; only the driver program can read the final value. Accumulators are useful for aggregating side information (counts, sums, error tallies) across many tasks. One caveat: updates made inside transformations may be applied more than once if a task is re-executed, so for a reliably exact count you should update accumulators inside actions such as foreach().
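To illustrate the semantics described above (tasks add to local copies, the driver merges them), here is a minimal plain-Python sketch that does not require a Spark installation. The names `run_task` and `partitions` are illustrative, not Spark API; in real PySpark the equivalent calls are `sc.accumulator(0)`, `acc.add(x)` inside `rdd.foreach(...)`, and `acc.value` on the driver.

```python
def run_task(partition):
    """Each 'task' accumulates into a local counter; tasks never read the total."""
    local = 0
    for record in partition:
        local += record          # corresponds to acc.add(record) in Spark
    return local                 # sent back to the driver when the task finishes

# Hypothetical data split across three tasks.
partitions = [[1, 2, 3], [4, 5], [6]]

# The driver merges the per-task results into the accumulator's final value.
total = sum(run_task(p) for p in partitions)
print(total)  # → 21
```

This also shows why the merge operation must be associative and commutative: tasks finish in an arbitrary order, so the driver must get the same result regardless of how the partial sums are combined.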