Answer Posted / Surendra Pal
"Accumulators are variables that can be updated by the tasks running on an executor during a Spark job. They allow a simple form of side effect computation where a value is collected across multiple data partitions and aggregated into one final result. Accumulators can be used to track counters, or for custom computations like finding the maximum or minimum value in a dataset.".