Wednesday 8 July 2015

Section 2: What are the core modules of Hadoop?

Hadoop has four core modules:

  1. Hadoop Common: The common utilities that support the other Hadoop modules.
  2. Hadoop Distributed File System (HDFS): A distributed file system that provides high-throughput access to application data.
  3. Hadoop YARN: A framework for job scheduling and cluster resource management.
  4. Hadoop MapReduce: A YARN-based system for parallel processing of large data sets.
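To illustrate the MapReduce model mentioned in item 4, here is a minimal plain-Java sketch of a word count in the map/shuffle/reduce style. Note this is an illustration only: it uses standard Java streams with no Hadoop dependency, and the class and method names are made up for this example; a real Hadoop job would extend `Mapper` and `Reducer` from the `org.apache.hadoop.mapreduce` API and run on YARN.

```java
import java.util.*;
import java.util.stream.*;

// Illustrative sketch of the MapReduce model in plain Java
// (no Hadoop dependency): map -> group by key -> reduce.
public class WordCountSketch {

    // "Map" phase: emit a (word, 1) pair for each word in a line.
    static Stream<Map.Entry<String, Integer>> map(String line) {
        return Arrays.stream(line.toLowerCase().split("\\s+"))
                     .filter(w -> !w.isEmpty())
                     .map(w -> Map.entry(w, 1));
    }

    // "Shuffle" + "reduce" phases: group the pairs by word and
    // sum the counts for each word.
    static Map<String, Integer> run(List<String> lines) {
        return lines.stream()
                    .flatMap(WordCountSketch::map)
                    .collect(Collectors.groupingBy(
                        Map.Entry::getKey,
                        Collectors.summingInt(Map.Entry::getValue)));
    }

    public static void main(String[] args) {
        Map<String, Integer> counts =
            run(List.of("hadoop stores data", "hadoop processes data"));
        System.out.println(counts.get("hadoop")); // 2
        System.out.println(counts.get("data"));   // 2
    }
}
```

In a real cluster, the map tasks run in parallel on the nodes holding the HDFS blocks, the framework shuffles the intermediate pairs by key, and the reduce tasks aggregate them; YARN schedules all of these tasks across the cluster.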
