Answer Posted / Athar Shakeel
SparkContext (often abbreviated as sc) is the entry point for interacting with a Spark cluster. It connects to the cluster manager (Standalone, YARN, or Mesos), acquires executors on worker nodes, and provides access to Spark's internal scheduler. Through it, the user can create RDDs, accumulators, and broadcast variables, submit jobs, and manage configuration settings. (Note: the ApplicationMaster is a YARN process launched by the cluster manager in YARN mode; it is not created by the SparkContext itself.)
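A minimal Scala sketch of typical SparkContext usage. The application name and the `local[*]` master are illustrative; on a real cluster you would pass a cluster URL (e.g. a Standalone master address or `yarn`) instead:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SparkContextExample {
  def main(args: Array[String]): Unit = {
    // Configure the application; "local[*]" runs Spark in-process
    // using all available cores, so no cluster is required.
    val conf = new SparkConf()
      .setAppName("SparkContextExample")  // illustrative name
      .setMaster("local[*]")

    // Creating the SparkContext connects to the cluster manager
    // and registers this application.
    val sc = new SparkContext(conf)

    // Create an RDD from a local collection and submit a job.
    val rdd = sc.parallelize(1 to 10)
    val total = rdd.map(_ * 2).reduce(_ + _)
    println(s"Sum of doubled values: $total")  // 2*(1+...+10) = 110

    sc.stop()  // release executors and cluster resources
  }
}
```

Calling `sc.stop()` when the application finishes matters: it releases the executors the context acquired so the cluster manager can reassign those resources.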