What is the relation between a job and a task in Hadoop?
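A minimal MapReduce driver may make the distinction concrete: the job is the whole program submitted to the cluster, while the framework breaks it into map tasks (one per input split) and reduce tasks. This is only an illustrative sketch; the class name, input/output paths, and reduce-task count below are placeholders, not part of any particular answer.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class JobVsTaskSketch {

    // Each map *task* runs one instance of this Mapper over a single input split.
    public static class TokenMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer it = new StringTokenizer(value.toString());
            while (it.hasMoreTokens()) {
                word.set(it.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Each reduce *task* runs one instance of this Reducer over a partition of the map output.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // One *job*: the complete MapReduce program submitted to the cluster.
        Job job = Job.getInstance(conf, "word-count-sketch");
        job.setJarByClass(JobVsTaskSketch.class);
        job.setMapperClass(TokenMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        // The framework divides the input into InputSplits and schedules one map
        // *task* per split; here we also ask for four reduce *tasks*.
        FileInputFormat.addInputPath(job, new Path("/data/input"));    // placeholder paths
        FileOutputFormat.setOutputPath(job, new Path("/data/output"));
        job.setNumReduceTasks(4);

        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```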
What infrastructure do we need to process 100 TB of data using Hadoop?
What are the considerations when doing hardware planning for the Master node in a Hadoop architecture?
What is Schema on Read and Schema on Write?
How can we change the split size if our commodity hardware has limited storage space?
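A short sketch of one common way to do this, assuming the question refers to the MapReduce input split size (rather than the HDFS block size, dfs.blocksize): the split size can be capped per job through FileInputFormat, or through the mapreduce.input.fileinputformat.split.maxsize / split.minsize properties. The 64 MB and 32 MB figures below are purely illustrative values.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class SplitSizeSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Equivalent property-based form (values in bytes):
        // conf.setLong("mapreduce.input.fileinputformat.split.maxsize", 64L * 1024 * 1024);
        // conf.setLong("mapreduce.input.fileinputformat.split.minsize", 32L * 1024 * 1024);

        Job job = Job.getInstance(conf, "small-split-job");

        // Cap each input split at 64 MB so more, smaller map tasks are created.
        FileInputFormat.setMaxInputSplitSize(job, 64L * 1024 * 1024);
        FileInputFormat.setMinInputSplitSize(job, 32L * 1024 * 1024);

        // ... set the Mapper/Reducer, input and output paths, then submit as usual.
    }
}
```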
How do we install VirtualBox and Ubuntu?
What are the Hadoop libraries, utilities, and miscellaneous Hadoop applications?
What are the two types of 'writes' in HDFS?
Why is Hadoop faster?
Can we do online transaction processing (OLTP) using Hadoop?
What are the steps involved in commissioning (adding) a node in a Hadoop cluster?
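A hedged sketch of the commissioning flow, assuming the cluster uses an include file referenced by dfs.hosts: the new host is added to that file, the NameNode is told to re-read it, and the DataNode daemon is started on the new machine. The host name and file path below are placeholders; in practice steps 2 and 3 are usually done from the command line rather than programmatically.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.tools.DFSAdmin;
import org.apache.hadoop.util.ToolRunner;

public class CommissionNodeSketch {
    public static void main(String[] args) throws Exception {
        // 1. Add the new DataNode's hostname to the include file referenced by
        //    dfs.hosts in hdfs-site.xml (placeholder path), e.g.:
        //      echo "new-datanode.example.com" >> /etc/hadoop/conf/dfs.include
        Configuration conf = new Configuration();

        // 2. Tell the NameNode to re-read the include/exclude files; this is the
        //    programmatic equivalent of `hdfs dfsadmin -refreshNodes`.
        int rc = ToolRunner.run(conf, new DFSAdmin(conf), new String[] {"-refreshNodes"});

        // 3. Start the DataNode (and NodeManager) daemons on the new host, then
        //    optionally run the HDFS balancer to move existing blocks onto it.
        System.exit(rc);
    }
}
```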
What is a checkpoint?
How is a task scheduled by the JobTracker?
What are the four characteristics of Big Data?
What is the function of ApplicationMaster?