
Project description

Documentation is available at Read the Docs


Goal

HOD is a set of scripts to start services, for example a Hadoop cluster, from within another resource management system (e.g. Torque/PBS). As such, it allows traditional users of HPC systems to experiment with Hadoop, or to use it as a production setup when no dedicated Hadoop cluster is available.

Hadoop is not the only software supported: HOD can also create HBase databases, launch IPython notebooks, and set up a Spark environment.

Benefits

There are two main benefits:

  1. Users can run jobs on a traditional batch cluster. This is good for small to medium Hadoop jobs where the framework is used but a dedicated 'big data' cluster isn't required. At this scale, the performance benefits of a parallel file system outweigh those of the 'share-nothing' architecture of an HDFS-style file system.

  2. Users from different groups can run whichever version of Hadoop they like. This removes the need for painful upgrades to running YARN clusters, and for hoping that all users' jobs are backwards compatible.

History

Hadoop used to ship its own HOD (Hadoop On Demand), but it was unmaintained and only supported Hadoop itself, without tuning. The HOD code shipped with the Hadoop 1.0.0 release was buggy, to say the least. An attempt was made to make it work on the UGent HPC infrastructure, and although a working Hadoop cluster was realised, extending its functionality was a nightmare. At that point (April 2012), hanythingondemand was started, with the goals of being more maintainable and supporting more tuning and functionality out of the box; HBase support, for example, was a minimum requirement. Hence 'Hadoop on Demand' became 'Hanything on Demand'. Apart from the acronym 'HOD', nothing of Hadoop On Demand was reused.

More on the history of Hadoop On Demand can be found in section 2 of this paper on YARN (PDF).

How does it work?

hanythingondemand works by launching an MPI job that uses the reserved nodes as a cluster-in-a-cluster; the various Hadoop services are then started on these nodes. Users can either launch a job at cluster startup (batch mode) or log in to the worker nodes (using the hod connect command) and interact with their services there.

Prerequisites

The requirements below can be installed using EasyBuild:

  • Python and various libraries:

    • mpi4py

      • e.g. on Fedora: yum install -y mpi4py-mpich2

      • If you build this yourself, you will probably need to set the $MPICC environment variable.

    • vsc-base - used for command line parsing.

    • vsc-mympirun - used for setting up the MPI job.

    • pbs_python - used for interacting with the PBS (a.k.a. Torque) server.

    • netifaces

    • netaddr

  • Java

    • Oracle JDK or OpenJDK - both installable with EasyBuild

  • Hadoop binaries

    • e.g. the Cloudera distribution versions (used to test HOD)
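When building mpi4py from source, its build step needs to find the MPI compiler wrapper, which is what $MPICC is for. A minimal sketch, assuming an MPI stack is available through your site's environment modules (the module name and wrapper path are illustrative, not prescribed by HOD):

```shell
# Illustrative sketch: building mpi4py from source against a site MPI stack.
# The module name (MPICH) and wrapper (mpicc) depend on your installation.
module load MPICH
export MPICC=$(which mpicc)   # point mpi4py's build at the MPI compiler wrapper
pip install --user mpi4py
```

If mpi4py comes from a system package or an EasyBuild module instead, none of this is needed.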

Example use cases:

Creating an HOD cluster:

# Submit a job to start a Hadoop cluster on 16 nodes:
$ hod create --dist Hadoop-2.3.0-cdh5.0 -n16 --label my-cluster

# Connect to your new cluster:
$ hod connect my-cluster

# Then, in your session, you can run your Hadoop jobs:
$ hadoop jar somejob.jar SomeClass arg1 arg2

‘Set it and forget it’ batch jobs:

# Run a batch job on 1 node:
$ hod batch --dist Hadoop-2.3.0-cdh5.0 --label my-cluster --script=my-script.sh
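The script passed via --script is an ordinary shell script that is run once the cluster is up. A hypothetical my-script.sh is sketched below; the jar, class name, and paths are placeholders for illustration, not part of HOD:

```shell
#!/bin/bash
# Hypothetical my-script.sh for 'hod batch'; runs once the Hadoop cluster is up.
# The jar, class, and input/output paths below are placeholders.
set -e

hadoop fs -mkdir -p input                      # stage input data into HDFS
hadoop fs -put data/*.txt input/

hadoop jar somejob.jar SomeClass input output  # run the job

hadoop fs -get output ./results                # fetch results before the cluster is torn down
```

Copying results out of HDFS at the end matters in batch mode, since the cluster (and its HDFS) goes away when the job finishes.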

