What is the difference between Scala and Spark?
Answer / Dwijendra Kumar Upadhyay
Scala is a general-purpose programming language created by Martin Odersky, while Apache Spark is an open-source big-data processing framework (itself written largely in Scala). Scala can be used to write Spark programs, but it is not required: Spark also provides APIs for Java, Python, and R.
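To make the relationship concrete, here is a minimal word-count sketch written in Scala against the Spark API. It assumes `spark-core`/`spark-sql` are on the classpath and an `input.txt` file exists; the object name and file path are illustrative. The same job could equally be written with PySpark or the Java API, which is why learning Scala is helpful but not strictly necessary for Spark.

```scala
// Minimal Spark word-count sketch in Scala.
// Assumes Spark libraries are on the classpath; "input.txt" is a placeholder path.
import org.apache.spark.sql.SparkSession

object WordCount {
  def main(args: Array[String]): Unit = {
    // Entry point to Spark; "local[*]" runs the job on all local cores.
    val spark = SparkSession.builder()
      .appName("WordCount")
      .master("local[*]")
      .getOrCreate()

    // Read lines, split into words, count occurrences of each word.
    val counts = spark.sparkContext
      .textFile("input.txt")
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    counts.take(10).foreach(println)
    spark.stop()
  }
}
```

Note that Spark here is a library the Scala program calls into, not a language of its own: the Scala compiler builds the program, and the Spark runtime distributes the work.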
What are the various functions of Spark Core?
Can you explain Spark MLlib?
Explain the various levels of persistence in Apache Spark.
What is Spark mapValues?
Can you list the limitations of using Apache Spark?
What is sc.parallelize in Spark?
What do you use Spark for?
How can data transfer be minimized when working with Apache Spark?
What happens when we submit a Spark job?
Do I need to learn Scala for Spark?
Explain the Catalyst query optimizer in Apache Spark.