Docker Swarm One-Shot Service Runner

Project description

Swarmer

Python application and API to run services from within a docker swarm.

How it works

The swarmer lives in a service inside a docker swarm. Once exposed, it offers an API to activate one shot docker service runs. There is a companion application to this service that is responsible for reporting the results back to this service. Once all tasks within a job are complete, the complete set of results is posted back to a specified callback URL.

Clients

While any client capable of running your subject code from within another container, based on the values passed, will work, there is a default client you can view. This list will be updated as more default clients become available:

Dependencies

To run, this image requires a redis service to be available. To receive results, you'll need a callback URL that accepts POST data (application/json) and is accessible from your swarm's location.

Getting started

You can take the compose example in this repository and run it as-is in your docker swarm via docker stack deploy -c docker-compose.yml <stack-name>, changing any of the values that you see fit.
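As a rough illustration, a minimal compose file for a swarm deployment might look like the sketch below. This is not the repository's actual compose example: the service names, published port, image tags, and network name here are all assumptions, so defer to the file shipped with the project.

```yaml
version: "3.7"

services:
  swarmer:
    # Image name/tag is an assumption; use the image the project publishes.
    image: swarmer:latest
    ports:
      # Published API port is an assumption; expose whichever port you configure.
      - "8500:8500"
    networks:
      - swarmer-net

  redis:
    # Swarmer requires a reachable redis service (see Dependencies above).
    image: redis:5-alpine
    networks:
      - swarmer-net

networks:
  swarmer-net:
    driver: overlay
```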

Once started, there will be a service exposed at the address of your swarm that you can post jobs to.

Making your own image

If you're building your own image using this application, you can simply pip install swarmer to include it. Then expose your desired ports and run swarmer as the entry point.
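A minimal Dockerfile along those lines might look like the following. This is a sketch under assumptions: the base image and exposed port are placeholders, and you should confirm that swarmer is the installed console-script name before relying on it.

```dockerfile
# Base image is an assumption; pick a Python version the project supports.
FROM python:3.7-slim

# Install swarmer from PyPI, as described above.
RUN pip install swarmer

# The port is an assumption; expose whichever port you configure.
EXPOSE 8500

# Run swarmer as the entry point.
ENTRYPOINT ["swarmer"]
```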

The initial request

When you want to submit a new job, you send a request to the /submit endpoint, with a content type of application/json and a body with the following:

{
  "image_name": "some-image:latest",
  "callback_url": "your postback url",
  "tasks": [
    {
      "task_name": "<Name>",
      "task_args": ["arg-one", "arg-two", ...]
    },
    ...
  ]
}
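The request body above can be built and posted with the Python standard library alone. A sketch, where the callback URL, task name, task arguments, and the swarm host/port in the commented-out request are all placeholder assumptions:

```python
import json
import urllib.request

def build_job(image_name, callback_url, tasks):
    """Build the JSON body expected by the /submit endpoint."""
    return json.dumps({
        "image_name": image_name,
        "callback_url": callback_url,
        "tasks": [
            {"task_name": name, "task_args": list(args)}
            for name, args in tasks
        ],
    })

payload = build_job(
    "some-image:latest",
    "https://example.com/results",  # placeholder callback URL
    [("count-words", ["--input", "/data/corpus.txt"])],  # placeholder task
)

# Uncomment to actually submit (requires a running swarmer service;
# host and port here are assumptions):
# req = urllib.request.Request(
#     "http://localhost:8500/submit",
#     data=payload.encode("utf-8"),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     response = json.load(resp)
```

The response contains the job identifier described below; its exact field name isn't specified here, so the sketch leaves parsing to json.load.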

You will receive a response containing an identifier. This is a unique job id you can use to check on the status of your job.

Checking the status of a job

If you have a running job that you would like to check on, you can send a GET request to the /status/<identifier> resource, where identifier is the job id returned when you submitted the job.

Getting your results

Once all the tasks for your job are complete, the URL you specified in the callback_url field will receive a POST request with the collected results. The general format is:

{
  "__image": "your image",
  "__callback_url": "your url",
  "tasks": [
    {
      "name": "task name",
      "status": 0,
      "args": ["your", "args"],
      "result": {
        "stdout": "the output written to stdout",
        "stderr": "the output written to stderr"
      }
    },
    ...
  ]
}

For each task, the status field represents the exit status of the task process, while the result object contains the output that your task wrote to the two output streams.
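A callback receiver can act on that payload directly, for instance by partitioning tasks on their exit status. A small sketch, assuming the payload format shown above (split_results is a hypothetical helper, not part of swarmer):

```python
def split_results(payload):
    """Partition a swarmer callback payload's tasks into
    (succeeded, failed) lists by process exit status."""
    succeeded = [t for t in payload["tasks"] if t["status"] == 0]
    failed = [t for t in payload["tasks"] if t["status"] != 0]
    return succeeded, failed

# Example payload, shaped like the callback body documented above.
example = {
    "__image": "some-image:latest",
    "__callback_url": "https://example.com/results",
    "tasks": [
        {"name": "ok-task", "status": 0, "args": [],
         "result": {"stdout": "done", "stderr": ""}},
        {"name": "bad-task", "status": 1, "args": [],
         "result": {"stdout": "", "stderr": "boom"}},
    ],
}

succeeded, failed = split_results(example)
```

Since status mirrors the process exit code, any nonzero value marks a failed task, and its stderr output is available under result for diagnosis.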

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

swarmer-0.6.0rc1.tar.gz (16.1 kB)

Uploaded Source

Built Distribution

swarmer-0.6.0rc1-py3-none-any.whl (20.8 kB)

Uploaded Python 3

File details

Details for the file swarmer-0.6.0rc1.tar.gz.

File metadata

  • Download URL: swarmer-0.6.0rc1.tar.gz
  • Upload date:
  • Size: 16.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.20.0 setuptools/40.6.2 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.0a4+

File hashes

Hashes for swarmer-0.6.0rc1.tar.gz
Algorithm Hash digest
SHA256 f47db6961a432f84d2230a7facb3de6254db512bb4443bd9f4b17dc58222cb6d
MD5 ac42658a57850e66af1fc8014cf590b6
BLAKE2b-256 e87a486536af053f5a87d0715864f2df6fa14263975db6c94a3b7a78bc67e078


File details

Details for the file swarmer-0.6.0rc1-py3-none-any.whl.

File metadata

  • Download URL: swarmer-0.6.0rc1-py3-none-any.whl
  • Upload date:
  • Size: 20.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.20.0 setuptools/40.6.2 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.0a4+

File hashes

Hashes for swarmer-0.6.0rc1-py3-none-any.whl
Algorithm Hash digest
SHA256 2e21406689afa8a01b942d622e200ad708e5204a44d5211fc8eac97160d21e8d
MD5 6e47477e3a882af7f25f934371414e7a
BLAKE2b-256 d151bf6c52958fd7f8e19ad1e900e6ba90efce1f83b6154c81e69dd3330c84fc

