Is it necessary to start Hadoop to run an Apache Spark application?
Answer / Sudam Kumar
It is not always necessary to start Hadoop to run a Spark application. Although Spark is often deployed alongside Hadoop, using HDFS for storage and YARN for resource management, it can also run independently: in local mode on a single machine, with its own standalone cluster manager, or on other cluster managers such as Mesos or Kubernetes. In those cases, Spark does not require Hadoop to be running.
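As a minimal sketch of this (assuming PySpark is installed locally, e.g. via `pip install pyspark`; the app name is just an illustrative placeholder), the following job runs entirely in local mode with no Hadoop daemons or HDFS involved:

```python
# Minimal PySpark job in local mode; no Hadoop cluster or HDFS required.
# Assumes PySpark is installed locally (e.g. `pip install pyspark`).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")          # use all local cores; no external cluster manager
    .appName("NoHadoopExample")  # hypothetical app name, for illustration only
    .getOrCreate()
)

# Build a small DataFrame in memory and run a simple aggregation.
df = spark.createDataFrame([("a", 1), ("b", 2), ("a", 3)], ["key", "value"])
df.groupBy("key").sum("value").show()

spark.stop()
```

Because the master is set to `local[*]` and the data lives in memory rather than HDFS, nothing Hadoop-related needs to be started for this to run.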
Can you explain Spark MLlib?
Do we need Hadoop for Spark?
What are SparkContext and SparkSession?
Is Spark Streaming real time?
What is the method to create a DataFrame?
What is a lineage graph in Apache Spark?
Explain the common workflow of a Spark program?
What are shuffle read and shuffle write in Spark?
How is Spark SQL different from HQL and SQL?
What do you mean by persistence?
Explain the fold() operation in Spark?
Is Spark better than Hadoop?