What are SparkContext and SparkSession?
Answer / Rajnish Kumar Mahiwal
SparkContext is the original entry point to a Spark application: it connects to the cluster manager (Standalone, YARN, or Mesos), and it is used to create RDDs, accumulators, and broadcast variables. SparkSession, introduced in Spark 2.0, is the unified entry point that combines SparkContext, SQLContext, and HiveContext into a single object, so you can work with DataFrames and Spark SQL without juggling separate contexts. The underlying SparkContext is still available from a session via spark.sparkContext.
Can you explain about the cluster manager of apache spark?
Does spark use java?
What do you know about schemardd?
Explain the level of parallelism in spark streaming?
Explain the default level of parallelism in Apache Spark.
What is lazy evaluation and how is it useful?
How do I download and install spark?
How does broadcast join work in spark?
What is setmaster in spark?
How is spark sql different from hql and sql?
How do you create an RDD?
What is a reliable and unreliable receiver in Spark?