Hadoop Common (not to be confused with Apache Commons, the general-purpose Java utility libraries) is one of the easier components to explain in the Hadoop context, even though working with it can get complicated. Hadoop Common is the collection of shared libraries and utilities that the other projects in the Hadoop ecosystem rely on. Examples include:
- A CLI MiniCluster that starts a single-node Hadoop cluster for testing purposes (see the sketch after this list)
- Native libraries for Hadoop
- Authentication and superusers, i.e. proxy users acting on behalf of other users (see the sketch at the end of this section)
- Hadoop secure mode
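To make the testing item concrete, here is a minimal sketch of the same idea in code. It is not the CLI tool itself but the in-process MiniDFSCluster class from the hadoop-hdfs test artifact, which serves the same purpose of spinning up a throwaway single-node HDFS for tests; the class name MiniClusterSketch and the path /tmp/hello.txt are illustrative assumptions.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;

// Minimal sketch: spin up an in-process, single-node HDFS cluster for a test,
// touch a file, check it exists, and shut the cluster down again.
public class MiniClusterSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf)
                .numDataNodes(1)   // a single DataNode is enough for testing
                .build();
        try {
            FileSystem fs = cluster.getFileSystem();
            Path testFile = new Path("/tmp/hello.txt");
            fs.create(testFile).close();               // write an empty test file
            System.out.println("File exists: " + fs.exists(testFile));
        } finally {
            cluster.shutdown();                        // always tear the cluster down
        }
    }
}
```

The CLI MiniCluster gives you the same kind of single-node test cluster from the command line, without writing any Java at all.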
You might not use all of the tools and libraries in Hadoop Common, since some of them only come into play when you work on Hadoop projects yourself.
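As for authentication and superusers: Hadoop Common ships the UserGroupInformation API, which lets a configured superuser act on behalf of (impersonate) another user. Below is a minimal sketch under the assumption that impersonation has been whitelisted in core-site.xml via the hadoop.proxyuser.&lt;superuser&gt;.hosts and hadoop.proxyuser.&lt;superuser&gt;.groups properties; the user name alice is purely illustrative.

```java
import java.security.PrivilegedExceptionAction;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.UserGroupInformation;

// Minimal sketch: a superuser process impersonates another user ("alice")
// so that all file system calls inside doAs() run with alice's identity.
public class ProxyUserSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // The real (super)user this process is logged in as.
        UserGroupInformation superUser = UserGroupInformation.getLoginUser();

        // Create a proxy user; this only succeeds if core-site.xml allows it
        // via hadoop.proxyuser.<superuser>.hosts / .groups; "alice" is an assumption.
        UserGroupInformation alice = UserGroupInformation.createProxyUser("alice", superUser);

        alice.doAs((PrivilegedExceptionAction<Void>) () -> {
            FileSystem fs = FileSystem.get(conf);
            // Any HDFS access here is performed as "alice", not as the superuser.
            System.out.println("Home directory as alice: " + fs.getHomeDirectory());
            return null;
        });
    }
}
```

In Hadoop secure mode the same impersonation mechanism is combined with Kerberos authentication.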