Answer Posted / Sachin Gaurav
Spark's repartition redistributes an RDD's (or DataFrame's) data into a specified number of partitions, which triggers a full shuffle across the cluster. It is used to increase parallelism, rebalance skewed data for better performance, or change the partition count before a write. Note that repartition can either increase or decrease the number of partitions; if you only need to decrease it, coalesce is usually cheaper because it merges existing partitions without a full shuffle.