How is streaming implemented in Spark?
Answer / Meena Singh
Spark Streaming processes live data streams by dividing the continuous stream into small batches (discretized streams, or DStreams) and processing each batch with Spark's core batch engine. The micro-batches are produced at a fixed, application-configured interval — anywhere from about a second to several minutes — chosen as a trade-off between latency and per-batch overhead for the specific application.
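The micro-batch idea can be sketched without Spark at all. The example below is a minimal, illustrative simulation: timestamped events are grouped into fixed-interval batches, and an ordinary batch computation (here, a word count) runs over each batch — the same pattern Spark Streaming applies with its core engine. The function names and sample data are assumptions for illustration, not Spark APIs.

```python
# Conceptual micro-batching sketch (no Spark required).
from collections import Counter

def micro_batches(events, batch_interval):
    """Group (timestamp_seconds, payload) events into fixed-interval batches."""
    batches = {}
    for ts, payload in events:
        batch_id = int(ts // batch_interval)  # which interval the event falls in
        batches.setdefault(batch_id, []).append(payload)
    return [batches[k] for k in sorted(batches)]

def process_batch(batch):
    """A plain batch computation: word counts over one micro-batch."""
    return Counter(word for line in batch for word in line.split())

# Events at 0.2s, 0.9s, and 1.4s with a 1-second batch interval
# fall into two batches: [0, 1) and [1, 2).
events = [(0.2, "spark streaming"), (0.9, "spark"), (1.4, "batch spark")]
results = [process_batch(b) for b in micro_batches(events, batch_interval=1.0)]
# → [Counter({'spark': 2, 'streaming': 1}), Counter({'batch': 1, 'spark': 1})]
```

In real Spark Streaming the batch interval is passed when creating the streaming context, and each resulting micro-batch is handed to the same RDD machinery used for batch jobs.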
Explain the concept of RDD (Resilient Distributed Dataset). Also, state how you can create RDDs in Apache Spark.
Define "PageRank".
What is meant by RDD lazy evaluation?
Explain the Catalyst framework.
What is a paired RDD in Spark?
Explain the fold() operation in Spark.
What are the benefits of using Spark with Apache Mesos?
Define the level of parallelism and explain why it is needed in Spark Streaming.
What is the difference between Spark and Scala?
What are Scala and Spark?
Does Google use Spark?
Can you explain Spark Core?