What happens when you submit a Spark job?
Answer / Parvez Khan
When you submit a Spark job (for example with `spark-submit`), the driver program creates a SparkContext and builds a logical plan, a DAG, from the transformations in your code. When an action is triggered, the DAG scheduler splits the plan into stages at shuffle boundaries, and each stage is divided into tasks, one per partition of the data. The task scheduler ships these tasks to executors across the cluster, where they run in parallel; each task operates on its partition of the data and produces intermediate results that are combined to produce the final result on the driver or written to storage.
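The partition-per-task flow above can be sketched in plain Python, with no Spark required. This is only a simplified illustration: `run_job`, `map_fn`, and `combine_fn` are made-up names for this sketch, not Spark APIs.

```python
# Simplified illustration of how a Spark job is broken into tasks:
# each "task" processes one partition, and per-task results are
# merged at the end, mirroring an action such as reduce().
from concurrent.futures import ThreadPoolExecutor

def run_job(data, num_partitions, map_fn, combine_fn):
    # 1. Split the input into partitions (one task per partition).
    partitions = [data[i::num_partitions] for i in range(num_partitions)]

    # 2. Run tasks in parallel, like executors processing partitions.
    def task(partition):
        return combine_fn(map_fn(x) for x in partition)

    with ThreadPoolExecutor(max_workers=num_partitions) as pool:
        partial_results = list(pool.map(task, partitions))

    # 3. Combine per-task results into the final answer on the "driver".
    return combine_fn(partial_results)

total = run_job(range(1, 11), num_partitions=4,
                map_fn=lambda x: x * x, combine_fn=sum)
print(total)  # sum of squares of 1..10 = 385
```

In real Spark the splitting, scheduling, and shuffling are handled by the DAG scheduler and executors, but the shape of the computation is the same: independent tasks over partitions, then a combine step.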
What are the common faults of developers while using Apache Spark?
If there is certain data that we want to use again and again in different transformations, what can improve the performance?
What are Spark jobs?
Define the level of parallelism and its need in Spark Streaming?
What is a write-ahead log (journaling) in Spark?
What is CoarseGrainedExecutorBackend?
Is Spark difficult to learn?
Please explain the sparse vector in Spark.
What is the difference between Spark and Scala?
Can you explain Apache Spark?
What happens when an action is executed in Spark?