How do I clear my spark cache?
Answer / Dheeraj Kumar Singh
To clear the Spark cache, call the unpersist() method on the cached RDD or DataFrame. Note that stop() does not clear the cache; it shuts down the entire SparkContext. For example:

    val rdd = sc.textFile("file.txt")
    rdd.cache()      // mark the RDD for caching
    rdd.count()      // an action materializes the cache
    rdd.unpersist()  // remove the cached blocks from memory/disk
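For DataFrames, or to drop everything cached in the session at once, here is a minimal sketch assuming a running SparkSession bound to the conventional name spark (the file path "file.txt" is only illustrative):

    // Sketch: clearing cached data, assuming a live SparkSession named `spark`
    val df = spark.read.text("file.txt")
    df.cache()                     // mark the DataFrame for caching
    df.count()                     // an action actually materializes the cache
    df.unpersist(blocking = true)  // drop this DataFrame's cached blocks now

    // Or clear every cached table/DataFrame in the current session:
    spark.catalog.clearCache()

Passing blocking = true makes unpersist wait until the blocks are actually freed; the default is non-blocking.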
Describe Partition and Partitioner in Apache Spark?
What is difference between client and cluster mode in spark?
Why lazy evaluation is good in spark?
What are the languages supported by apache spark?
What is lazy evaluation and how is it useful?
What is the command to start and stop the Spark in an interactive shell?
What is data skew in spark?
What are shared variables?
Explain the run-time architecture of Spark?
Can you define rdd?
How is RDD in Apache Spark different from Distributed Storage Management?
What is a databricks cluster?