Answer Posted / Abhishek Kumar Vaishnav
Spark Core is the foundation of the Apache Spark ecosystem and provides its fundamental functionality, including:
1) RDD Abstraction: Defines Resilient Distributed Datasets (RDDs), Spark's fault-tolerant, partitioned data structure. Spark Core does not store data itself; it reads from and writes to external storage systems such as HDFS, S3, or the local filesystem.
2) Distributed Execution: Schedules and manages the execution of tasks across a cluster of machines, including memory management and fault recovery.
3) Base APIs: Offers APIs in Scala, Java, Python, and R, on top of which the higher-level libraries are built: Spark SQL, Spark Streaming, MLlib for machine learning, and GraphX for graph processing.
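To make the RDD and distributed-execution ideas concrete, here is a minimal conceptual sketch in plain Python (not the actual Spark API): a dataset is split into partitions, one task runs per partition, and the driver combines the partial results. The function names `parallelize` and `run_tasks` are illustrative stand-ins for what Spark Core does internally.

```python
from concurrent.futures import ThreadPoolExecutor

def parallelize(data, num_partitions):
    """Split data into roughly equal partitions (like Spark's RDD partitioning)."""
    size = max(1, -(-len(data) // num_partitions))  # ceiling division
    return [data[i:i + size] for i in range(0, len(data), size)]

def run_tasks(partitions, func):
    """Apply func to each partition in parallel: one 'task' per partition."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(func, partitions))

data = list(range(1, 11))                  # the logical dataset
partitions = parallelize(data, 4)          # RDD-like: data split into 4 partitions
partial_sums = run_tasks(partitions, sum)  # tasks run on each partition
total = sum(partial_sums)                  # driver aggregates partial results
print(total)  # 55
```

In real Spark, the equivalent would be `sc.parallelize(range(1, 11), 4).sum()`, with the scheduler placing tasks on cluster executors instead of local threads.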