Answer Posted / Sandip Kumar Srivastava
HDFS (Hadoop Distributed File System) is needed for big data processing because it stores large datasets across many commodity nodes: files are split into large blocks, and each block is replicated on several nodes. This replication gives fault tolerance (a failed node does not lose data), while the distributed layout gives horizontal scalability and high-throughput sequential reads, making HDFS well suited to big data workloads.
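To make the fault-tolerance idea concrete, here is a minimal Python sketch (not real HDFS, and the block size, node names, and helper functions are illustrative assumptions) that splits data into blocks, replicates each block across simulated nodes, and shows the file is still readable after one node fails:

```python
# Toy simulation of HDFS-style block replication (NOT real HDFS).
from itertools import cycle

BLOCK_SIZE = 8     # bytes per block here; real HDFS defaults to 128 MB
REPLICATION = 3    # copies of each block; HDFS's default replication factor
NODES = ["node1", "node2", "node3", "node4"]   # hypothetical cluster

def split_into_blocks(data: bytes, size: int = BLOCK_SIZE):
    """Split a byte string into fixed-size blocks."""
    return [data[i:i + size] for i in range(0, len(data), size)]

def place_blocks(blocks, nodes, replication=REPLICATION):
    """Assign each block to `replication` distinct nodes, round-robin."""
    placement = {}                       # block index -> list of node names
    starts = cycle(range(len(nodes)))
    for idx in range(len(blocks)):
        start = next(starts)
        placement[idx] = [nodes[(start + r) % len(nodes)]
                          for r in range(replication)]
    return placement

def read_file(blocks, placement, alive_nodes):
    """Reassemble the file using any surviving replica of each block."""
    out = b""
    for idx, block in enumerate(blocks):
        if not any(n in alive_nodes for n in placement[idx]):
            raise IOError(f"block {idx} lost: all replicas down")
        out += block
    return out

data = b"big data spread across a small simulated cluster"
blocks = split_into_blocks(data)
placement = place_blocks(blocks, NODES)

# Simulate one node failing: every block still has surviving replicas,
# so the whole file remains readable.
alive = set(NODES) - {"node2"}
assert read_file(blocks, placement, alive) == data
```

With a replication factor of 3 on 4 nodes, any single-node failure leaves at least two replicas of every block, which is exactly the property HDFS relies on for fault tolerance.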