What is Apache Spark's architecture?
Answer / Rao Yaduman
Apache Spark follows a master–worker architecture. The driver program creates a SparkContext, which connects to a cluster manager (Standalone, YARN, Mesos, or Kubernetes) to acquire resources. The driver analyzes the job, builds a task schedule, and sends tasks to executors. Executors run on worker nodes, execute the tasks, and hold cached partitions of Resilient Distributed Datasets (RDDs), returning results to the driver.
What is Apache Spark written in?
Why do we need compression, and what are the different compression formats supported?
What are executor memory and driver memory in Spark?
What is the difference between Spark ML and Spark MLlib?
What is the difference between Dataset and DataFrame?
What is lazy evaluation and how is it useful?
What are the downsides of Spark?
What are the various types of shared variables in Apache Spark?
What are the roles of the file system in any framework?
Does Spark use Java?
What is Apache Spark good for?
What operations do RDDs support?