
Backend.AI Agent

The Backend.AI Agent is a small daemon that:

  • Reports the status and available resource slots of a worker to the manager
  • Routes code execution requests to the designated kernel container
  • Manages the lifecycle of kernel containers (create/monitor/destroy them)

Package Structure

  • ai.backend
    • agent: The agent package
      • server: The agent daemon which communicates with the manager and the Docker daemon
      • watcher: A side-by-side daemon which provides a separate HTTP endpoint for accessing the status information of the agent daemon and manipulation of the agent's systemd service


Installation

Please visit the installation guides.

Kernel/system configuration

Recommended kernel parameters in the bootloader (e.g., Grub):

cgroup_enable=memory swapaccount=1
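On Debian/Ubuntu-style systems, these parameters would typically be appended to the kernel command line in /etc/default/grub; the file path and variable name below follow the Debian convention and may differ on your distro, so treat this as a sketch rather than a recipe.

```shell
# /etc/default/grub (Debian/Ubuntu convention; adjust for your distro)
GRUB_CMDLINE_LINUX_DEFAULT="quiet cgroup_enable=memory swapaccount=1"
```

After editing, regenerate the bootloader configuration (e.g., sudo update-grub) and reboot for the change to take effect.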

Recommended resource limits (e.g., in /etc/security/limits.conf):


root hard nofile 512000
root soft nofile 512000
root hard nproc 65536
root soft nproc 65536
user hard nofile 512000
user soft nofile 512000
user hard nproc 65536
user soft nproc 65536


Recommended sysctl parameters (e.g., in /etc/sysctl.conf):

net.ipv4.ip_local_port_range="40000 65000"
net.ipv4.tcp_rmem=4096 12582912 16777216
net.ipv4.tcp_wmem=4096 12582912 16777216

The ip_local_port_range should not overlap with the container port range pool (default: 30000 to 31000).
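That non-overlap condition can be sanity-checked with a small shell sketch; the values below are the defaults quoted above.

```shell
#!/bin/sh
# Sanity-check sketch: the ephemeral port range must not overlap the
# agent's container port pool (values are the defaults from this README).
local_lo=40000; local_hi=65000   # net.ipv4.ip_local_port_range
pool_lo=30000;  pool_hi=31000    # container port pool (default)

# Two ranges overlap iff each one starts before the other ends.
if [ "$local_lo" -le "$pool_hi" ] && [ "$pool_lo" -le "$local_hi" ]; then
  status=overlap
else
  status=ok
fi
echo "$status"
```

With the defaults above this prints "ok"; if you shift either range so they intersect, it reports the overlap you need to fix.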

For development


  • libsnappy-dev or snappy-devel system package depending on your distro
  • Python 3.6 or higher with pyenv and pyenv-virtualenv (optional but recommended)
  • Docker 18.03 or later with docker-compose (18.09 or later is recommended)

First, you need a working manager installation. For detailed instructions on installing the manager, please refer to the manager's README and then come back here.

Preparing working copy

Install and activate git-lfs to work with pre-built binaries in src/ai/backend/runner.

$ git lfs install

Next, prepare the source clone of the agent and install from it as follows. pyenv is just a recommendation; you may use other virtualenv management tools.

$ git clone agent
$ cd agent
$ pyenv virtualenv venv-agent
$ pyenv local venv-agent
$ pip install -U pip setuptools
$ pip install -U -r requirements-dev.txt


We use flake8 and mypy to statically check our code styles and type consistency. Enable those linters in your favorite IDE or editor.

Halfstack (single-node development & testing)

With halfstack, you can run the agent simply for local development and testing. Note that you need a working manager already running with halfstack!

Recommended directory structure

Install the common package as an editable package in the agent (and the manager) virtualenvs to keep the codebase up-to-date:

$ cd agent
$ pip install -U -e ../common


$ mkdir -p "./scratches"
$ cp config/halfstack.toml ./agent.toml

Then, run it (for debugging, append the --debug flag):

$ python -m ai.backend.agent.server

To run the agent-watcher:

$ python -m ai.backend.agent.watcher

The watcher shares the same configuration TOML file with the agent. Note that the watcher is only meaningful if the agent is installed as a systemd service named backendai-agent.service.

To run tests:

$ python -m flake8 src tests
$ python -m pytest -m 'not integration' tests



Put a TOML-formatted agent configuration (see the sample in config/sample.toml) in one of the following locations:

  • agent.toml (current working directory)
  • ~/.config/ (user-config directory)
  • /etc/ (system-config directory)

Only the first one found is used by the daemon.
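The first-found-wins lookup can be sketched in shell, using a temporary directory with stand-in paths for the three search locations (the directory names below are illustrative, not the real paths):

```shell
#!/bin/sh
# Sketch of the "first found wins" config lookup described above.
tmp=$(mktemp -d)
mkdir -p "$tmp/cwd" "$tmp/userconf" "$tmp/etc"
# No agent.toml in the cwd stand-in; present in the other two locations.
touch "$tmp/userconf/agent.toml" "$tmp/etc/agent.toml"

found=
for d in "$tmp/cwd" "$tmp/userconf" "$tmp/etc"; do
  if [ -f "$d/agent.toml" ]; then
    found="$d/agent.toml"
    break   # only the first match is used; later ones are ignored
  fi
done
echo "using $found"
rm -rf "$tmp"
```

Here the user-config stand-in wins because the working directory has no agent.toml, mirroring the precedence order listed above.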

The agent reads most other configurations from the etcd v3 server where the cluster administrator or the Backend.AI manager stores all the necessary settings.

The etcd address and namespace must match those of the manager for the agent to be paired and activated. By specifying distinct namespaces, you may share a single etcd cluster among multiple separate Backend.AI clusters.

By default the agent uses the /var/cache/scratches directory for making temporary home directories used by kernel containers (the /home/work volume mounted in containers). Note that the directory must exist beforehand and the agent-running user must have ownership of it. You can change the location with the scratch-root option in agent.toml.
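For orientation, an illustrative agent.toml fragment is sketched below. Only the scratch-root option is documented above; the section names and etcd keys are assumptions, so consult the sample in config/sample.toml for the authoritative layout.

```toml
# Illustrative sketch only -- see config/sample.toml for the real keys.

[etcd]                           # assumed section name
addr = { host = "127.0.0.1", port = 2379 }
namespace = "local"              # must match the manager's etcd namespace

[container]                      # assumed section name
scratch-root = "./scratches"     # where per-kernel /home/work volumes live
```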

Running from a command line

The minimal command to execute:

python -m ai.backend.agent.server
python -m ai.backend.agent.watcher

For more arguments and options, run the command with --help option.

Example config for systemd


Description=Backend.AI Agent
Requires=docker.service
After=docker.service




#! /bin/sh
if [ -z "$PYENV_ROOT" ]; then
  export PYENV_ROOT="$HOME/.pyenv"
  export PATH="$PYENV_ROOT/bin:$PATH"
fi
eval "$(pyenv init -)"
eval "$(pyenv virtualenv-init -)"

cd /home/user/
if [ "$#" -eq 0 ]; then
  exec python -m ai.backend.agent.server
else
  exec "$@"
fi
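A complete unit file wrapping such a launcher script might look like the sketch below. The service name follows the backendai-agent.service name mentioned earlier; the user account and script path are assumptions for illustration.

```ini
[Unit]
Description=Backend.AI Agent
Requires=docker.service
After=docker.service

[Service]
Type=simple
User=user
WorkingDirectory=/home/user
# Hypothetical path to the launcher script shown above
ExecStart=/home/user/run-agent.sh
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

You could then enable it with systemctl enable --now backendai-agent.service.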


The manager and agent should run in the same local network or different networks reachable via VPNs, whereas the manager's API service must be exposed to the public network or another private network that users have access to.

The manager must be able to access TCP ports 6001, 6009, and 30000 to 31000 of the agents in default configurations. You can of course change those port numbers and ranges in the configuration.

Manager-to-Agent TCP ports    Usage
6001                          ZeroMQ-based RPC calls from managers to agents
6009                          HTTP watcher API
30000-31000                   Port pool for in-container services

The operation of the agent itself does not require incoming/outgoing access to the public Internet, but if users' computation programs need the Internet, the Docker containers should be able to access the public Internet (possibly via corporate firewalls).

Agent-to-X TCP ports          Usage
manager:5002                  ZeroMQ-based event push from agents to the manager
etcd:2379                     etcd API access
redis:6379                    Redis API access
docker-registry:{80,443}      HTTP(S) access to the Docker registry
(Other hosts)                 Depending on user program requirements

