What are Accumulators?
Answer / Chhavi Singh
Accumulators in Spark are shared variables that worker tasks can only add to and that only the driver program can read. They are useful for aggregating a value across all tasks of a job, for example counting records or summing a metric. In PySpark, you create an accumulator with `acc = sc.accumulator(0)` and update it from tasks with `acc.add(...)`. Note that Spark guarantees each task's update is applied exactly once only for updates made inside actions (such as `foreach()`); updates made inside transformations such as `map()` may be re-applied if a task is retried.
What is the difference between pyspark and spark?
How do I open pyspark shell in windows?
Name the parts of Spark Ecosystem?
What are actions?
What is rdd in pyspark?
What optimizations can a developer make while working with Spark?
Name a few Transformations and Actions?
Does pyspark work with python3?
What is the difference between persist() and cache()?
What is the role of cache() and persist()?
What is Pyspark?
What are actions and transformations?