How can you launch Spark jobs inside Hadoop MapReduce?
Answer Posted / Mohammad Shadab
Strictly speaking, Spark jobs do not run inside the MapReduce engine itself; instead, Spark runs on the same Hadoop cluster by using YARN as its resource manager. You launch the job with Spark's built-in 'spark-submit' command, setting suitable configurations such as '--master yarn', pointing Spark at the cluster's Hadoop configuration (e.g. via the HADOOP_CONF_DIR environment variable), and specifying the jar file that contains your Spark application.
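A minimal sketch of such a submission, assuming the application jar, main class, and HDFS paths shown here are placeholders for your own:

```shell
# HADOOP_CONF_DIR tells spark-submit where to find the cluster's
# YARN and HDFS configuration files (path is an assumption; adjust to your cluster).
export HADOOP_CONF_DIR=/etc/hadoop/conf

# --master yarn submits the job to the Hadoop cluster's YARN resource manager;
# --deploy-mode cluster runs the driver inside the cluster as well.
# The class name, jar, and input/output paths below are hypothetical.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.WordCount \
  --num-executors 4 \
  --executor-memory 2g \
  my-spark-app.jar hdfs:///input hdfs:///output
```

In client mode ('--deploy-mode client') the driver would instead run on the machine that invoked spark-submit, which is handy for interactive debugging.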