Why are transformations lazy in Spark?
Answer / Romi Awasthi
Transformations are lazy in Spark so the engine can optimize the whole computation before running it. Calling a transformation does not execute anything immediately; it only extends a logical plan describing the steps required to process the data. Physical execution starts only when an action such as collect(), count(), or save() is triggered, at which point Spark can optimize the full plan, for example by pipelining operations and avoiding the materialization of intermediate results.
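The idea above can be illustrated with a minimal sketch in plain Python (no Spark dependency). The `LazyRDD` class here is a hypothetical toy, not Spark's actual implementation: transformations only record steps in a plan, and the action `collect()` is what finally runs them.

```python
# Conceptual sketch of lazy evaluation (hypothetical toy class, not Spark's API):
# transformations append to a logical plan; only the action executes it.
class LazyRDD:
    def __init__(self, data, plan=None):
        self.data = data
        self.plan = plan or []  # logical plan: list of pending operations

    def map(self, fn):
        # Transformation: nothing runs yet, we just extend the plan.
        return LazyRDD(self.data, self.plan + [("map", fn)])

    def filter(self, pred):
        # Also a transformation: still no computation.
        return LazyRDD(self.data, self.plan + [("filter", pred)])

    def collect(self):
        # Action: only now does the recorded plan execute over the data.
        out = list(self.data)
        for op, fn in self.plan:
            if op == "map":
                out = [fn(x) for x in out]
            else:
                out = [x for x in out if fn(x)]
        return out

rdd = LazyRDD(range(5)).map(lambda x: x * 2).filter(lambda x: x > 4)
# No computation has happened yet; collect() triggers it.
print(rdd.collect())  # [6, 8]
```

In real Spark the same pattern holds: `rdd.map(...)` and `rdd.filter(...)` return new RDDs carrying a lineage graph, and work is scheduled across the cluster only when an action runs.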
What is a Spark application?
Why do we need RDDs in Spark?
When to use Spark SQL?
What are Spark executor cores?
What is RDD map?
What is the use of RDDs in Spark?
What are the various storages from which Spark can read data?
What is the difference between map and flatMap?
Define a worker node.
What is serialization in Spark?
What port does Spark use?
Name three companies that use Spark Streaming services.