Section 2: What are the core modules of Hadoop?
Hadoop has four core modules:
- Hadoop Common: The common utilities that support the other Hadoop modules.
- Hadoop Distributed File System (HDFS): A distributed file system that provides high-throughput access to application data (a short usage sketch follows this list).
- Hadoop YARN: A framework for job scheduling and cluster resource management.
- Hadoop MapReduce: A YARN-based system for parallel processing of large data sets.
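
To make the HDFS and Hadoop Common entries more concrete, here is a minimal sketch of how an application might read a file stored in HDFS through the Java `FileSystem` API. The class name `HdfsCat` and the path `/user/demo/input.txt` are hypothetical examples, and the snippet assumes a cluster configuration (`core-site.xml`) is available on the classpath.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsCat {
    public static void main(String[] args) throws Exception {
        // Hadoop Common supplies Configuration; it reads core-site.xml from
        // the classpath to locate the cluster's default file system.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Open a file stored in HDFS and stream its contents to stdout.
        Path file = new Path("/user/demo/input.txt"); // hypothetical path
        try (FSDataInputStream in = fs.open(file)) {
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
    }
}
```

Note how the modules cooperate here: Hadoop Common provides the shared `Configuration` and I/O utilities, while HDFS serves the file data; YARN and MapReduce would come into play once this kind of access is scaled out into a scheduled, parallel job.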