What is the default number of partitions in Spark?
Answer / Ajay Singh
The commonly cited default of 200 partitions applies to Spark SQL shuffles and is controlled by the spark.sql.shuffle.partitions configuration property. The separate spark.default.parallelism property governs RDD operations (such as join and reduceByKey when no partition count is given); it defaults to the total number of cores across all executors, or to the number of local cores when running in local mode.
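A quick way to check both properties is from a spark-shell session. This is a minimal sketch, assuming an already-created local SparkSession named `spark` (as spark-shell provides); the exact defaultParallelism value depends on your machine's core count:

```scala
import org.apache.spark.sql.functions.col

// Spark SQL shuffle partitions: "200" unless overridden.
println(spark.conf.get("spark.sql.shuffle.partitions"))

// RDD-side default parallelism: number of cores in local mode,
// total executor cores on a cluster (unless spark.default.parallelism is set).
println(spark.sparkContext.defaultParallelism)

// A DataFrame that goes through a shuffle picks up the SQL default:
val counts = spark.range(1000).groupBy(col("id")).count()
println(counts.rdd.getNumPartitions)  // 200 with default settings
```

Tuning spark.sql.shuffle.partitions down is a common optimization for small datasets, since 200 tasks per shuffle adds scheduling overhead when each partition holds only a few rows.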
Explain the foreach() operation in Apache Spark.
What is a Spark shuffle?
How can Spark be connected to Apache Mesos?
What is setMaster in Spark?
Explain the use of the File System API in Apache Spark.
Can Spark work without Hadoop?
Is Spark used for machine learning?
What are the key features of Apache Spark that you like?
Is Scala required for Spark?
What is the difference between coalesce and repartition?
Why is Spark popular?
How to save an RDD?