What do you understand by Transformations in Spark?
Answer / Mohit Saran
Transformations in Apache Spark are operations that take an RDD as input and return a new RDD as output (for example, map, filter, and flatMap). They are lazily evaluated: Spark only records the lineage of transformations, and the accumulated chain is executed when an action (such as collect or count) is called.