Do I need to install Hadoop for Spark?
Answer / Anoop Kumar Srivastava
Although Apache Spark can run on top of Hadoop YARN, a pre-installed Hadoop cluster is not strictly required. Spark can also run in standalone mode, using its own built-in cluster manager, or use Apache Mesos as the resource manager instead.
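As a minimal sketch, the commands below launch Spark in standalone mode with no Hadoop installation at all, then submit the bundled SparkPi example. `$SPARK_HOME` is assumed to point at an unpacked Spark distribution; the host, port, and jar path are the Spark defaults but may differ on your setup (older Spark releases name the worker script `start-slave.sh`):

```shell
# Start Spark's own built-in master and one worker -- no Hadoop/YARN involved.
$SPARK_HOME/sbin/start-master.sh                      # master listens on spark://localhost:7077
$SPARK_HOME/sbin/start-worker.sh spark://localhost:7077

# Submit the bundled SparkPi example against the standalone master.
$SPARK_HOME/bin/spark-submit \
  --master spark://localhost:7077 \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100

# For quick testing you can skip the cluster entirely and run in local mode,
# where driver and executors share a single JVM:
$SPARK_HOME/bin/spark-submit \
  --master "local[*]" \
  --class org.apache.spark.examples.SparkPi \
  $SPARK_HOME/examples/jars/spark-examples_*.jar 100
```

Note that even without a Hadoop cluster, Spark ships with (or expects) Hadoop client libraries on its classpath, since it reuses them for filesystem access.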
What is the role of the Spark Driver in Spark applications?
Which serialization libraries are supported in Spark?
How can we create RDDs in Apache Spark?
How do you process big data with Spark?
Does Spark run MapReduce?
What is the map() operation in Apache Spark?
What is the Catalyst framework in Spark?
How do you parse XML data? Which kind of class do you use with Java to parse data?
Is there any benefit of learning MapReduce, then?
What is the difference between coalesce and repartition in Spark?
What is the Spark tool?
Which is better, Hadoop or Spark?