What are the disadvantages of using Spark?
Answer / Matendra Singh Manu
Some disadvantages of using Apache Spark include:
1) High memory consumption: because Spark keeps working data in memory, executors need considerably more RAM than comparable MapReduce jobs, and oversized caches can lead to spills to disk or OutOfMemory errors.
2) Scaling challenges: very large clusters incur significant shuffle and network overhead, which makes scaling non-trivial.
3) Iterative workloads: iterative algorithms repeatedly materialize intermediate results, which adds complexity and drives up resource usage.
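Memory pressure (point 1 above) is usually addressed through configuration rather than application code. Below is a minimal sketch of the relevant tuning knobs in a `spark-defaults.conf`-style fragment; the property names are standard Spark settings, but the values shown are illustrative placeholders, not recommendations:

```properties
# Heap size per executor: must cover cached data plus execution memory
spark.executor.memory            4g
# Fraction of heap shared by execution and storage (default 0.6)
spark.memory.fraction            0.6
# Portion of that shared region protected for cached blocks (default 0.5)
spark.memory.storageFraction     0.5
# Release idle executors instead of holding their memory
spark.dynamicAllocation.enabled  true
# Kryo serializes cached objects more compactly than Java serialization
spark.serializer                 org.apache.spark.serializer.KryoSerializer
```

In practice, right-sizing `spark.executor.memory` against the dataset being cached matters more than any single fraction setting.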
When running Spark applications, is it necessary to install Spark on all the nodes of YARN cluster?
What do you understand by worker node?
What do you know about transformations in spark?
Why do fires spark?
Do I need to learn scala for spark?
What is spark dynamic allocation?
Is it possible to run Apache Spark on Apache Mesos?
How will you implement SQL in Spark?
Can you explain benefits of spark over mapreduce?
State the difference between persist() and cache() functions.
How does lazy evaluation work in spark?
What is spark vectorization?