Is Hadoop mandatory for Spark?
Answer / Ajeet Pratap Singh
No, Hadoop is not strictly required for Apache Spark. However, the two are commonly used together because of their synergy: Spark can use HDFS for storage and YARN for resource management. Spark can also run in standalone mode or on Mesos without any Hadoop installation.
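For illustration, here is a minimal sketch of a Spark job that runs with no Hadoop at all, assuming only the spark-sql dependency is on the classpath. The master URL "local[*]" uses Spark's built-in scheduler instead of YARN, and the input path file:///tmp/data.txt is a hypothetical local file standing in for HDFS.

// A minimal sketch: Spark running entirely without Hadoop.
import org.apache.spark.sql.SparkSession

object SparkWithoutHadoop {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("SparkWithoutHadoop")
      .master("local[*]")  // built-in scheduler; no YARN, no Hadoop daemons
      .getOrCreate()

    // Read from the local filesystem instead of HDFS.
    val lines = spark.sparkContext.textFile("file:///tmp/data.txt")
    println(s"Line count: ${lines.count()}")

    spark.stop()
  }
}

The same job could also be pointed at a standalone cluster with --master spark://host:7077, again with no Hadoop services running.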
Can you list the limitations of using Apache Spark?
Explain the filter transformation?
What is Spark MLlib?
Do you need to install Spark on all nodes of a YARN cluster while running Spark on YARN?
What is a write-ahead log (journaling)?
Can you explain Spark RDD?
What are the components of Spark?
Explain the core components of a distributed Spark application?
What is Spark SQL?
What is Spark certification?
Name various types of Cluster Managers in Spark.
List various commonly used machine learning algorithms?