Answer Posted / Sharad Kumar
The Spark engine is responsible for executing tasks, managing memory, and scheduling work across the nodes of a cluster. Within a single application it supports two scheduling modes for jobs: FIFO (the default), which runs jobs in submission order, and FAIR, which shares executor resources across concurrent jobs so that short jobs are not starved behind long-running ones.
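As a minimal sketch of how the scheduling mode is chosen, the `spark.scheduler.mode` property can be set in `spark-defaults.conf` (or via `SparkConf` at application startup); the property name comes from Spark's configuration documentation, while the pool file path shown is only an illustrative assumption:

```
# spark-defaults.conf — minimal sketch
# Switch the within-application job scheduler from FIFO (default) to FAIR.
spark.scheduler.mode            FAIR

# Optional: point at a fair-scheduler pool definition file
# (path here is hypothetical; adjust to your deployment).
spark.scheduler.allocation.file /etc/spark/fairscheduler.xml
```

The same property can also be set programmatically, e.g. `conf.set("spark.scheduler.mode", "FAIR")`, before the SparkContext is created.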