Answer Posted / Aditya Kumar Singh
An accumulator in Spark is a shared variable that tasks running on executors can only add to, while only the driver program can read its value. It lets you compute aggregated statistics (counters, sums) as a side effect of an action, without shipping the running total back and forth between tasks.