Answer Posted / Mohd. Jamshaid Khan
RDDs in Apache Spark can be created in several ways. One common method is to parallelize an existing in-memory collection with the SparkContext. Another is to load data from an external storage system such as HDFS, Cassandra, or a local file system using the APIs Spark provides. RDDs can also be derived programmatically by applying transformations such as map, filter, and join to existing RDDs. Additionally, Spark Streaming creates RDDs from live data streams, batching the incoming data into a series of RDDs (a DStream).