Explain the core components of a distributed Spark application.
Answer posted by Jitendra Verma
The core components of a distributed Spark application are:

1) SparkContext - the entry point to the Spark cluster, created in the driver program; it connects to the cluster manager and handles resource allocation and job scheduling.
2) RDDs (Resilient Distributed Datasets) - immutable, partitioned collections of data that can be processed in parallel across the cluster and recomputed from their lineage after a failure.
3) Transformations - methods such as map and filter that create new RDDs from existing ones; they are lazy, so calling them only records the operation rather than running it.
4) Actions - methods such as collect and count that return a value to the driver or perform side effects; calling an action triggers execution of all transformations leading up to it.
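To make the lazy-transformation / eager-action distinction concrete, here is a small toy sketch in plain Python. The `ToyRDD` class is purely illustrative (not the real PySpark API, which would need a running JVM cluster); its `map`, `filter`, and `collect` names mirror the real RDD methods, and the comments note the real-API equivalents.

```python
# Toy sketch of Spark's execution model: transformations only record a
# lineage of operations; an action replays that lineage to compute results.
class ToyRDD:
    def __init__(self, data, lineage=()):
        self._data = data        # the source "partition" (a plain iterable here)
        self._lineage = lineage  # recorded transformations, not yet executed

    # --- transformations: return a new ToyRDD; no computation happens ---
    def map(self, f):
        return ToyRDD(self._data, self._lineage + (("map", f),))

    def filter(self, pred):
        return ToyRDD(self._data, self._lineage + (("filter", pred),))

    # --- actions: replay the lineage and return a value to the "driver" ---
    def collect(self):
        out = list(self._data)
        for op, f in self._lineage:
            out = [f(x) for x in out] if op == "map" else [x for x in out if f(x)]
        return out

    def count(self):
        return len(self.collect())


rdd = ToyRDD(range(1, 11))                    # like sc.parallelize(range(1, 11))
squares = rdd.map(lambda x: x * x)            # transformation: nothing runs yet
evens = squares.filter(lambda x: x % 2 == 0)  # still nothing runs
print(evens.collect())  # action: prints [4, 16, 36, 64, 100]
```

In real Spark the same shape applies: `sc.parallelize(...)` builds the RDD, chained `map`/`filter` calls stay lazy, and only `collect()` or `count()` ships work out to the executors.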