What are the different dimensions of persistence in Apache Spark?
Answer Posted / Jagrit Kashyap
The three dimensions of persistence in Spark are:

1) Storage Level: Determines where and how an RDD or DStream is cached, e.g. in memory only, in memory in serialized form, spilled to disk when memory runs out, or on disk only. The common levels are MEMORY_ONLY, MEMORY_ONLY_SER, MEMORY_AND_DISK, and DISK_ONLY.

2) Block Replication: Controls how many replicas of each cached block are kept across the nodes of the cluster. The "_2" variants of the storage levels (e.g. MEMORY_ONLY_2) keep two replicas, so a cached partition survives the loss of a single node.

3) Checkpointing: Periodically saves the state of an RDD to reliable storage (such as HDFS), truncating its lineage. This gives fault tolerance and faster recovery, since a lost partition can be reloaded from the checkpoint instead of being recomputed from the full lineage.