Why is HDFS only suitable for large data sets and not the correct tool to use for many small files?
Answer posted by Rajesh Ram
HDFS (Hadoop Distributed File System) is optimized for large data sets: its distributed architecture stores files as large blocks (128 MB by default) spread across many nodes, which favors high-throughput sequential reads over low-latency access. Many small files defeat this design for two reasons. First, the NameNode keeps the metadata for every file, directory, and block in memory, so millions of small files consume NameNode heap regardless of how little data they hold, and the NameNode eventually becomes the bottleneck. Second, each small file typically occupies its own block and is processed by its own map task, so jobs over many small files spawn a large number of short-lived tasks whose startup overhead dominates the actual work. Storing the same data in fewer, larger files (or packing small files into container formats such as SequenceFiles or HAR archives) avoids both problems.
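A back-of-the-envelope sketch makes the NameNode memory point concrete. A commonly cited rule of thumb is roughly 150 bytes of NameNode heap per namespace object (each file and each block); the exact figure varies by Hadoop version, so treat the constant below as an assumption for illustration:

```python
# Rough estimate of NameNode heap consumed by file metadata.
# BYTES_PER_OBJECT (~150 bytes per file or block object) is a
# rule-of-thumb assumption, not an exact figure for any release.

BYTES_PER_OBJECT = 150

def namenode_metadata_bytes(num_files, blocks_per_file=1):
    """Each file costs one file object plus one object per block."""
    objects = num_files * (1 + blocks_per_file)
    return objects * BYTES_PER_OBJECT

# Same ~100 TB of data, stored two ways:
# (a) 100 million 1 MB files, one block each
# (b) 100 thousand 1 GB files, eight 128 MB blocks each
small = namenode_metadata_bytes(100_000_000, blocks_per_file=1)
large = namenode_metadata_bytes(100_000, blocks_per_file=8)

print(f"small files: {small / 1e9:.2f} GB of NameNode heap")  # ~30 GB
print(f"large files: {large / 1e9:.2f} GB of NameNode heap")  # ~0.14 GB
```

The data volume is identical in both layouts, but the small-file layout needs two orders of magnitude more NameNode memory just to track the namespace.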