Answer Posted / Rinku Gupta
Spark's cache() should be used when an RDD or DataFrame is reused across multiple actions. Because Spark evaluates lazily, every action would otherwise recompute the dataset's full lineage from the source; caching keeps the data in executor memory after the first action, so later actions read it directly instead of recomputing it. For finer control over storage (for example, allowing spill to disk), persist() with an explicit StorageLevel can be used instead of cache().