
Nimbo: Run jobs on AWS with a single command

Nimbo is a CLI tool that allows you to run code on AWS as if you were running it locally. It's as simple as:

nimbo run "python -u train.py --lr=3e-4"

It also provides many useful commands that make working with AWS faster, such as checking prices, logging onto an instance, or syncing data. For example:

  • nimbo list-spot-gpu-prices
  • nimbo ssh
  • nimbo push datasets
  • nimbo pull logs
  • nimbo delete-all-instances

Nimbo drastically simplifies your AWS workflow by taking care of instance, environment, data, and IAM management, with no changes to your codebase. Since it is independent of your code, you can run any type of job you want.

Key Features

  • Your Infrastructure: Code runs on your EC2 instances and data is stored in your S3 buckets. This means that you can easily use the resulting models and data from anywhere within your AWS organization, and use your existing permissions and credentials.
  • User Experience: Nimbo gives you the command line tools to make working with AWS as easy as working with local resources. No more complicated SDKs and never-ending documentation.
  • Customizable: Want to use a custom AMI? Just change the image ID in the Nimbo config file. Want to use a specific conda package? Just add it to your environment file. Nimbo is built with customization in mind, so you can use any setup you want.
  • Seamless Spot Instances: With Nimbo, using spot instances is as simple as changing a single value in the config file. Enjoy the 70-90% savings of AWS spot instances with no changes to your workflow.
  • Managed Images: We provide managed AMIs with the latest drivers and unified naming across all regions. We will also release AMIs preloaded with ImageNet and other large datasets, so that you can simply spin up an instance and start training.
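For illustration, a config along these lines might look as follows. Only the image ID, the spot toggle, and an environment file are mentioned above; the exact key names and all values here are assumptions, so check docs.nimbo.sh for the actual schema:

```yaml
# nimbo-config.yml -- illustrative sketch only; see docs.nimbo.sh for the real schema
image: ami-0123456789abcdef0   # custom AMI ID (placeholder value)
spot: yes                      # hypothetical single toggle for spot instances
conda_env: environment.yml     # assumed name for the conda environment file
```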

You can find more information at nimbo.sh, or read the docs at docs.nimbo.sh.

Getting started

Please visit the Getting started page in the docs.

Examples

Sample projects can be found in our examples repo, nimbo-examples.

Product roadmap

  • Implement nimbo notebook: You will be able to spin up a jupyter lab notebook running on an EC2 instance. Data will be continuously synced with your S3 bucket so that you don't have to worry about doing manual backups. Your local code will be automatically synced with the instance, so you can code locally and test the changes directly on the remote notebook. The notebook will also be synced with your local machine so you don't have to worry about losing your notebook changes when deleting the instance.
  • GCP support: Use the same commands to run jobs on AWS or GCP.
  • Deployment: Deploy ML models to AWS/GCP with a single command. Automatically create an API endpoint for providing video/audio/text and getting results from your model back.
  • Add Docker support: Right now we assume you are using a conda environment, but many people use Docker to run jobs. This feature would allow you to run a command such as nimbo run "docker-compose up", where the Docker image would be fetched from DockerHub (or an equivalent registry) through a docker_image parameter in the nimbo-config.yml file.
  • Add AMIs with preloaded large datasets: Downloading and storing large datasets like ImageNet is a time consuming process. We will make available AMIs that come with an extra EBS volume mounted on /datasets, so that you can use large datasets without worrying about storing them or waiting for them to be fetched from your S3 bucket. Get in touch if you have datasets you would like to see preloaded with the instances.

Developing

If you want to make changes to the codebase, clone this repo and:

  1. Run pip install -e . to install nimbo locally. As you make code changes, your local nimbo installation will update automatically.
  2. Run pip install -r requirements/dev.txt to install all development dependencies.

Running Tests

Create two instance keys, one for eu-west-1 and one for us-east-2. The key filenames should begin with the region name, e.g. eu-west-1-dave.pem. Do not forget to chmod 400 the created keys. Place these keys in src/nimbo/tests/assets.
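The key setup above can be sketched as a few shell commands (the filenames use the example name from the paragraph; substitute your own keys):

```shell
# Create the assets directory used by the test suite
mkdir -p src/nimbo/tests/assets

# Move the downloaded keys into place; filenames must start with the
# region name ("dave" is just the example suffix from above)
for key in eu-west-1-dave.pem us-east-2-dave.pem; do
    if [ -f "$key" ]; then
        mv "$key" src/nimbo/tests/assets/
    fi
done

# SSH rejects private keys that other users can read
chmod 400 src/nimbo/tests/assets/*.pem 2>/dev/null || true
```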

Create a nimbo-config.yml file in src/nimbo/tests/assets with only the aws_profile, security_group, and role fields set.

Make sure that the security_group in your test nimbo-config.yml allows inbound access from your IP in all regions; otherwise the tests will fail.
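A minimal test config along those lines might look like this; the field names are the ones listed above, while the values are placeholders you should replace with your own profile, security group, and role:

```yaml
# src/nimbo/tests/assets/nimbo-config.yml
aws_profile: default              # your AWS CLI profile name
security_group: my-test-sg        # must allow your IP in all regions
role: my-nimbo-instance-role      # placeholder IAM role name
```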

Use pytest to run the tests:

pytest -x

