Answer Posted / Shivram Yadav
No, in Apache Spark a Resilient Distributed Dataset (RDD) cannot be broadcast directly: `SparkContext.broadcast()` accepts only local, driver-side values, and passing an RDD to it raises an error. To share an RDD's contents efficiently across tasks, first `collect()` the RDD to the driver and then broadcast the resulting local collection; Spark ships the broadcast value to each worker node once, rather than once per task.