Answer Posted / Ravendra Singh Pundhir
No, PySpark and Spark are not the same thing: PySpark is the Python API for Apache Spark. However, the claim that you must always install Spark separately is outdated. Since Spark 2.2, `pip install pyspark` ships with a bundled Spark runtime that is sufficient for local development and testing (you still need a Java runtime, since Spark runs on the JVM). For running jobs on a production cluster, you would point PySpark at a full Spark installation instead.