What is sc.textFile?
Answer / Anupriya Ekka
In Apache Spark, `sc` is the SparkContext, the entry point to a Spark application. `sc.textFile(path)` reads a text file (from the local filesystem, HDFS, or any other Hadoop-supported URI) and returns an RDD (Resilient Distributed Dataset) whose elements are the individual lines of the file.
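A minimal sketch of this in Scala (the file name `data.txt` is an assumption for illustration; running it requires a Spark installation):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object TextFileExample {
  def main(args: Array[String]): Unit = {
    // Local-mode config so the sketch runs without a cluster
    val conf = new SparkConf().setAppName("TextFileExample").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Read the file as an RDD of lines; each element is one line of text
    val lines = sc.textFile("data.txt")

    // RDDs are lazy: the file is not actually read until an action
    // such as count() is invoked
    println(s"Line count: ${lines.count()}")

    sc.stop()
  }
}
```

Note that `textFile` only records *how* to load the data; the read happens when an action (like `count()` or `collect()`) forces evaluation.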
What does Apache Spark do?
Explain the concept of Resilient Distributed Dataset (RDD).
What is SparkConf in Spark?
What is a DataFrame in Spark?
Name the languages supported by Apache Spark; which one is most popular?
Are Spark DataFrames distributed?
Explain the fold() operation in Spark.
Why is Scala used in Spark?
In a given Spark program, how will you identify whether an operation is a transformation or an action?
Explain the reduceByKey() Spark operation.
What are stages and tasks in Spark?
Why is there a need for broadcast variables when working with Apache Spark?