How can Apache Spark be used alongside Hadoop?
Answer Posted / Ravi Kumar
Apache Spark can be used alongside Hadoop by leveraging YARN (Yet Another Resource Negotiator) as the cluster manager. This integration lets Spark applications run within the existing Hadoop ecosystem: Spark reads from and writes to HDFS for storage, while YARN schedules its executors next to other Hadoop workloads, so Spark jobs and traditional MapReduce jobs can share the same cluster.
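As a minimal sketch of this setup, the standard `spark-submit` tool can target YARN directly; the application name, file path, and resource sizes below are illustrative assumptions, not values from the original answer:

```shell
# Submit a Spark application to a Hadoop cluster via YARN.
# Assumes HADOOP_CONF_DIR points at the cluster's Hadoop config
# and that wordcount.py and the input path are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 4 \
  --executor-memory 2g \
  wordcount.py hdfs:///data/input/sample.txt
```

With `--master yarn`, Spark asks the YARN ResourceManager for containers instead of managing its own standalone cluster, and the `hdfs://` URI shows the application reading input directly from Hadoop's distributed file system.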