Does Spark need HDFS?
Answer / Roushan Kumar Tiwary
Although Spark can work with HDFS (Hadoop Distributed File System), it is not a requirement. Spark can read and write data from a variety of sources, such as the local file system, Cassandra, MongoDB, S3, etc.
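A minimal sketch in Scala of the point above: the same DataFrame reader works against the local file system and S3 with no HDFS cluster involved. The file paths, bucket name, and CSV layout are hypothetical, and the S3 read assumes the hadoop-aws connector and AWS credentials are already configured.

```scala
import org.apache.spark.sql.SparkSession

object SparkWithoutHdfs {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark on the local machine; no Hadoop cluster is needed.
    val spark = SparkSession.builder()
      .appName("spark-without-hdfs")
      .master("local[*]")
      .getOrCreate()

    // Read a CSV from the local file system (path is a placeholder).
    val localDf = spark.read
      .option("header", "true")
      .csv("file:///tmp/example.csv")

    // The same reader API works for S3 when the s3a connector is on the
    // classpath (bucket and key are hypothetical).
    val s3Df = spark.read
      .option("header", "true")
      .csv("s3a://example-bucket/data/example.csv")

    localDf.show(5)
    s3Df.show(5)

    spark.stop()
  }
}
```

The only thing that changes between storage systems is the URI scheme (file://, s3a://, etc.); the rest of the Spark code is identical, which is why HDFS is optional.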
Can Spark work without Hadoop?
Where are RDDs stored?
What is a cluster in Apache Spark?
What do you mean by speculative execution in Apache Spark?
How does Spark achieve fault tolerance?
Explain the Spark coalesce() operation.
Can you explain the cluster manager of Apache Spark?
What is the difference between Spark and Scala?
Please explain the sparse vector in Spark.
What is meant by Transformation? Give some examples.
In a very huge text file, you want to just check if a particular keyword exists. How would you do this using Spark?
Are Spark DataFrames immutable?