What are the disadvantages of using Apache Spark over Hadoop MapReduce?
Answer / Mandhata Ram
Although Apache Spark is generally faster and more expressive than Hadoop MapReduce, it does have some disadvantages. The main one is cost: Spark's in-memory processing requires much more RAM (and often CPU) than MapReduce's disk-based model, so clusters are more expensive to provision. Second, because Spark evaluates transformations lazily, reusing an RDD or DataFrame that has not been persisted with cache()/persist() re-runs its entire lineage, which can make long pipelines that reuse intermediate results inefficient. Third, Spark's fault tolerance relies on recomputing lost partitions from lineage: since intermediate results live in memory rather than on disk, recovering from a node failure can be more expensive than in MapReduce, which writes intermediate output to disk between stages.
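To make the recomputation point concrete, here is a minimal sketch in plain Python (no Spark required; the names `lazy_rdd` and `expensive_transform` are stand-ins invented for illustration, not Spark APIs) of why two actions on an un-persisted, lazily evaluated dataset re-run the whole transformation chain, while materializing it once avoids that:

```python
# Plain-Python sketch of Spark-style lazy evaluation (no Spark required).
# An un-persisted "RDD" re-runs its transformations on every action.

compute_calls = 0

def expensive_transform(x):
    """Stand-in for a costly map() stage; counts how often it runs."""
    global compute_calls
    compute_calls += 1
    return x * x

def lazy_rdd(data):
    # Like an un-persisted RDD: each call (each "action") recomputes.
    return lambda: [expensive_transform(x) for x in data]

data = [1, 2, 3]
rdd = lazy_rdd(data)

# Two "actions" on the same un-cached dataset: the lineage runs twice.
total = sum(rdd())
count = len(rdd())
print(compute_calls)   # 6: three elements recomputed for each action

# "Persisting" materializes the result once; later actions reuse it.
cached = rdd()         # analogous to rdd.persist() followed by an action
compute_calls = 0
total2 = sum(cached)
count2 = len(cached)
print(compute_calls)   # 0: no recomputation after caching
```

In real Spark the same trade-off applies: calling `persist()` or `cache()` on a reused RDD trades memory for recomputation, which is exactly why Spark's memory appetite is higher than MapReduce's.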