What is the difference between Spark and Scala?
Answer / Ruchi Dhawan
Apache Spark is a distributed big data processing framework, while Scala is a general-purpose, high-level programming language. Spark applications can be written in several languages, including Scala, but Spark itself is not Scala: Scala is simply one of the languages used to write code for Spark applications (Spark itself is largely implemented in Scala).
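To make the distinction concrete, here is a minimal sketch of a word count written in plain driver-side code, using Python (one of Spark's supported application languages) and only its standard library. The comments show the equivalent Spark RDD pipeline: the language expresses the logic, while Spark's contribution is distributing that same logic across a cluster. The `sc.textFile(...)` line in the comment assumes an existing `SparkContext` named `sc`.

```python
from collections import Counter

def word_count(lines):
    """Word count with no Spark involved -- just the host language.

    In a Spark application, the same logic would be expressed on an RDD,
    e.g. (assuming a SparkContext `sc`):
        sc.textFile(path) \
          .flatMap(str.split) \
          .map(lambda w: (w, 1)) \
          .reduceByKey(lambda a, b: a + b)
    The framework then schedules those transformations across a cluster.
    """
    # Split every line into words, then tally occurrences.
    words = [w for line in lines for w in line.split()]
    return Counter(words)

print(word_count(["spark and scala", "scala"]))
# Counter({'scala': 2, 'spark': 1, 'and': 1})
```

The same contrast holds for Scala: a Scala word count over a local collection needs nothing from Spark, whereas Spark's RDD API (`flatMap`, `map`, `reduceByKey`) deliberately mirrors the language's collection operations so distributed code reads like local code.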
Do we need Hadoop for Spark?
Compare MapReduce and Spark?
Why do we use persist() on the links RDD?
Can you run Spark without Hadoop?
What are the different ways of representing data in Spark?
What is the map() operation in Apache Spark?
How will you connect Apache Spark with Apache Mesos?
How does lazy evaluation work in Spark?
Explain the machine learning library in Spark?
How is RDD in Spark different from Distributed Storage Management?
What is flatMap?
Is it necessary to start Hadoop to run any Apache Spark application?