What do you know about transformations in Spark?
Answer / Ashutosh Verma
"In Apache Spark, a transformation is an operation that takes an RDD and converts it into another RDD. Transformations are lazy - they do not actually execute the RDD until an action (like count(), collect(), or saveAsTextFile()) is called. Some common transformations include map(), filter(), flatMap(), reduceByKey(), and join().".