Answer Posted / Srijan Chaubey
"Filter Transformation in Apache Spark is used to create a new dataset that only contains elements from the input dataset which satisfy a specified condition. It applies the provided function to each element and returns a new RDD containing only those elements for which the function evaluates to true."