

How can you trigger automatic clean-ups in Spark to handle accumulated metadata?




Answer / Mukesh Kumar Gangwar

In Apache Spark, automatic clean-up of accumulated metadata (shuffle files, out-of-scope RDDs, and broadcast variables) is handled by the ContextCleaner. It is controlled by the "spark.cleaner.referenceTracking" configuration property, which is enabled (set to true) by default; when an RDD, shuffle, or broadcast variable is garbage-collected on the driver, the cleaner removes its associated state from the executors. For long-running applications you can also set "spark.cleaner.periodicGC.interval" (default 30min) to force periodic JVM garbage collection so that clean-ups are actually triggered. In older Spark versions (before 2.0), the "spark.cleaner.ttl" property set a time-to-live after which metadata was forcibly cleaned. A complementary approach is to divide a long-running job into batches and write the intermediate results to disk.
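As a sketch, these cleaner settings can be passed at submission time. The values shown for the GC interval and the application script name are illustrative, not defaults:

```shell
# Pass cleaner settings explicitly via spark-submit.
# spark.cleaner.referenceTracking=true is the default; shown here for clarity.
# The 10min periodic-GC interval is an example value (default is 30min).
spark-submit \
  --conf spark.cleaner.referenceTracking=true \
  --conf spark.cleaner.periodicGC.interval=10min \
  long_running_job.py   # hypothetical application script
```

The same properties can instead be placed in conf/spark-defaults.conf or set on the SparkConf object before the SparkContext is created.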



More Apache Spark Interview Questions

What is spark in big data?

Why is spark so fast?

What is sparkcontext in spark?

Compare MapReduce and Spark?

Can you explain spark streaming?

What are the features and characteristics of Apache Spark?

Do you need to install Spark on all nodes of Yarn cluster while running Spark on Yarn?

Explain first() operation in Apache Spark RDD?

Is apache spark a programming language?

What is the task of Spark Engine?

What is a tuple in spark?

How rdd can be created in spark?
