How many ways can we create an RDD in Spark?
Answer / Shashank Agarwal
In Apache Spark, an RDD can be created in three main ways: by parallelizing an existing collection from the driver program (e.g. `sc.parallelize` on a Scala, Java, or Python collection), by loading an external dataset such as a text file or Hadoop SequenceFile (e.g. `sc.textFile` or `sc.sequenceFile`), and by applying transformations (such as `map` or `filter`) to an existing RDD, which produces a new RDD.