Does Spark require Hadoop?
Answer / Ranjeet Kumar Chanchal
While Spark can run on top of Hadoop for resource management (YARN), it is not strictly required. Spark can also run in standalone mode or with other cluster managers such as Mesos. However, a Hadoop installation does provide additional benefits, such as access to HDFS storage.
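As a sketch of the choice described above, the cluster manager is selected with the `--master` option of `spark-submit`. The host names, ports, and the application file `my_app.py` below are placeholders, not values from the original answer:

```shell
# Local mode: no Hadoop and no external cluster manager needed;
# "local[*]" uses one worker thread per CPU core.
spark-submit --master "local[*]" my_app.py

# Spark's built-in standalone cluster manager (still no Hadoop required);
# spark-master:7077 is a placeholder host:port.
spark-submit --master spark://spark-master:7077 my_app.py

# YARN: this is the case that needs a Hadoop installation, with
# HADOOP_CONF_DIR pointing at the cluster's configuration directory.
spark-submit --master yarn --deploy-mode cluster my_app.py

# Mesos as an alternative cluster manager (placeholder host:port).
spark-submit --master mesos://mesos-master:5050 my_app.py
```

Only the YARN case depends on Hadoop; the other three run without it, which is the point of the answer.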
What is executor memory in Spark?
Explain the RDD properties?
Can you explain Spark Streaming?
Is it possible to use Apache Spark for accessing and analyzing data stored in Cassandra databases?
What is the number of executors in Spark?
What are the functions of "Spark Core"?
What is Spark reduceByKey?
What is the Tungsten engine in Spark?
What is a partitioner in Spark?
Explain Apache Spark Streaming? How is the processing of streaming data achieved in Apache Spark?
Is there a module to implement SQL in Spark? How does it work?
Which one will you choose for a project: Hadoop MapReduce or Apache Spark?