
Combine DO droplets with your ssh configuration

Project description

DigitalOcean -> ssh config

Depends on https://github.com/koalalorenzo/python-digitalocean, which can be installed with pip3 install -U python-digitalocean

This Python 3 script helps you keep your ssh config in sync with your DigitalOcean droplets.

$ python3 do_to_ssh_config.py production

· Reading /home/alex/.config/do_to_ssh_config/production.json
· Parsing /home/alex/.ssh/config
· Fetching droplets from DO
· Writing into your ssh config file

✓ Done, 11 droplets synced

Features

  • Supports different ssh keys for each droplet, depending on the DO tags of the droplet
  • Works with different configurations and can write in different sections of your ssh config

How to

Step 1: Create the json configuration file

Save this at ~/.config/do_to_ssh_config/<name>.json, where <name> is whatever you want to call the configuration, e.g. production or testing. For this example I will use production.

{
    "token": "DIGITAL_OCEAN_READ_ONLY_TOKEN_HERE",
    "keys": {
        "tagToKey": {
        },
        "default": {
            "key": "common",
            "priority": 0
        }
    },
    "startMark": "# DO production",
    "endMark": "# /DO production",
    "hostPrefix": "do-prod-"
}

Note: This is the simplest possible configuration file: it uses the same key for every droplet and the droplet name as the Host. For more options, read on.

  1. Generate a new personal DigitalOcean API read-only access token in the DigitalOcean control panel (under API)
  2. hostPrefix is the prefix added to the Host key in your ssh config for each droplet loaded through this configuration; it can be anything you want
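
Loading and validating this file can be sketched in a few lines. This is my own illustration, not the script's actual code; the function name and the validation logic are assumptions, while the field names come from the example above.

```python
import json

# Fields that the configuration examples in this document rely on.
REQUIRED_FIELDS = ("token", "keys", "startMark", "endMark", "hostPrefix")

def load_config(path):
    """Read a do_to_ssh_config JSON file and verify that the fields
    shown above are present. A minimal sketch with no error recovery."""
    with open(path) as f:
        config = json.load(f)
    missing = [field for field in REQUIRED_FIELDS if field not in config]
    if missing:
        raise ValueError(f"{path}: missing required fields: {missing}")
    return config
```

The real script presumably resolves ~/.config/do_to_ssh_config/<name>.json from the name you pass on the command line before reading it like this.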

Step 2: Add the two marks to your ssh config

The above JSON configuration contains the startMark and endMark. These should appear somewhere inside your ssh configuration and can be whatever you want (they should start with # so ssh treats them as comments):

# DO production
# /DO production

The script deletes everything between these two marks and writes the new entries there. Be careful not to add your own hosts between them.
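
In Python, the replace-between-marks step might look like this. This is a sketch of the idea only; the function name, the list-based representation, and the example entries are mine, not the script's actual code.

```python
def replace_between_marks(lines, start_mark, end_mark, new_entries):
    """Return a copy of `lines` where everything strictly between the
    two marks is replaced by `new_entries`; the marks themselves stay."""
    start = lines.index(start_mark)          # raises ValueError if a mark is missing
    end = lines.index(end_mark, start + 1)
    return lines[:start + 1] + new_entries + lines[end:]

ssh_config = [
    "Host my-own-server",
    "    Hostname 1.2.3.4",
    "# DO production",
    "Host stale-entry",       # gets wiped on every run
    "# /DO production",
]
updated = replace_between_marks(
    ssh_config, "# DO production", "# /DO production",
    ["Host do-prod-blog", "    Hostname 5.6.7.8"],
)
```

Because the marks are looked up fresh each time and everything between them is discarded, running this repeatedly is idempotent, which is why re-running the script is safe as long as your own entries stay outside the marks.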

Step 3: Run the script

$ python3 do_to_ssh_config.py production

· Reading /home/alex/.config/do_to_ssh_config/production.json
· Parsing /home/alex/.ssh/config
· Fetching droplets from DO
· Writing into your ssh config file

✓ Done, 11 droplets synced

Now your ssh config will look like this:

# DO production
Host do-prod-control-center1517024146
    # control-center1517024146
    Hostname X.X.X.X
    IdentityFile ~/.ssh/common
    User user
Host do-prod-control-center1517027030
    # control-center1517027030
    Hostname X.X.X.X
    IdentityFile ~/.ssh/common
    User user
... 9 more entries
# /DO production

If the autogenerated droplet names make for ugly Host names, you can derive the Host names from the droplet tags instead; read on.

I want to use a different ssh key, not common!

  • Change the keys.default.key setting

I want to use a different ssh key per droplet tag!

  • Change the keys.tagToKey setting and add entries like:
"control-center": {
    "key": "cc_prv",
    "priority": 7
},
"consul-server": {
    "key": "cs_prv",
    "priority": 6
},
"postgres-master": {
    "key": "common",
    "priority": 5
}

The final config will look like this:

{
    "token": "DIGITAL_OCEAN_READ_ONLY_TOKEN_HERE",
    "keys": {
        "tagToKey": {
            "control-center": {
                "key": "cc_prv",
                "priority": 7
            },
            "consul-server": {
                "key": "cs_prv",
                "priority": 6
            },
            "postgres-master": {
                "key": "common",
                "priority": 5
            }
        },
        "default": {
            "key": "common",
            "priority": 0
        }
    },
    "startMark": "# DO production",
    "endMark": "# /DO production",
    "hostPrefix": "do-prod-"
}

Important: A droplet can have more than one tag; that is why there is a priority field. In the above example, if a droplet has both the control-center and consul-server tags, it uses the key with the higher priority (here control-center). If a droplet has no tags, or none of its tags appear in tagToKey, it uses the default key.
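
The key-selection rule just described can be sketched like this (my own illustration; the function name is an assumption, while the data shapes mirror the JSON above):

```python
def pick_key(droplet_tags, tag_to_key, default):
    """Return the key entry with the highest priority among the
    droplet's tags, or the default entry if none of them match."""
    matches = [tag_to_key[tag] for tag in droplet_tags if tag in tag_to_key]
    return max(matches, key=lambda entry: entry["priority"]) if matches else default

tag_to_key = {
    "control-center": {"key": "cc_prv", "priority": 7},
    "consul-server": {"key": "cs_prv", "priority": 6},
    "postgres-master": {"key": "common", "priority": 5},
}
default = {"key": "common", "priority": 0}

# A droplet tagged both consul-server and control-center gets cc_prv,
# because control-center has the higher priority.
chosen = pick_key(["consul-server", "control-center"], tag_to_key, default)
```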

For the droplets that match a specific tag, the Host in the ssh config now takes the name of the tag, not the droplet name:

# DO production
Host do-prod-control-center
    # control-center1517024146
    Hostname X.X.X.X
    IdentityFile ~/.ssh/cc_prv
    User user
Host do-prod-control-center2
    # control-center1517027030
    Hostname X.X.X.X
    IdentityFile ~/.ssh/cc_prv
    User user
... more entries
# /DO production

This is convenient for large environments where the droplet names are autogenerated.

Note: The droplet name is still visible as a comment on the first line of each entry.

Note: As shown in the above example, if two or more droplets share the same tag, an ascending number is appended to the Host value.
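
The numbering behaviour can be sketched as follows (my own reading of the example output above, not the script's actual code):

```python
def numbered_hosts(names):
    """Make tag-derived host names unique by appending an ascending
    number from the second occurrence on, as in the example output:
    control-center, control-center2, control-center3, ..."""
    seen = {}
    result = []
    for name in names:
        seen[name] = seen.get(name, 0) + 1
        result.append(name if seen[name] == 1 else f"{name}{seen[name]}")
    return result
```

Note that the first occurrence keeps the bare tag name; numbering starts at 2.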

Now you can see everything easily using ssh's tab completion, and connect anywhere:

$ ssh do-prod- <hit TAB key twice>

do-prod-control-center   do-prod-mongodb  do-prod-load-balancer    do-prod-nodejs2          do-prod-postgres-slave   do-prod-blog
do-prod-control-center2  do-prod-landing-page     do-prod-nodejs           do-prod-postgres-master  do-prod-redis            

I have production and testing and I work in 10 different companies!

Simply create different configuration files under ~/.config/do_to_ssh_config/, one for each of your use cases, like production.json and testing.json. It helps to use a different hostPrefix for each one.

Also, add the different pairs of marks to your ssh config file, e.g.:

# DO production
# /DO production

# DO testing
# /DO testing

Now if you run

$ python3 do_to_ssh_config.py production

it will read from production.json and write between the corresponding marks inside your ssh config. And if you run

$ python3 do_to_ssh_config.py testing

it will read from testing.json and write between the corresponding marks.

Can I safely re-run the script as many times as I want?

Yes, provided you haven't placed any entries of your own between the marks specified in the configuration. Everything between the marks is deleted each time the script runs.

Download files

Download the file for your platform.

Source Distribution

do_to_ssh_config-0.0.4.tar.gz (5.4 kB)

Uploaded Source

Built Distribution


do_to_ssh_config-0.0.4-py3-none-any.whl (5.4 kB)

Uploaded Python 3

File details

Details for the file do_to_ssh_config-0.0.4.tar.gz.

File metadata

  • Download URL: do_to_ssh_config-0.0.4.tar.gz
  • Upload date:
  • Size: 5.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.2.0 requests-toolbelt/0.8.0 tqdm/4.25.0 CPython/3.6.5

File hashes

Hashes for do_to_ssh_config-0.0.4.tar.gz
Algorithm Hash digest
SHA256 669dda332a88fd7d87383f64a8997a7c3b9f758b8394157415ae3814408f4c6d
MD5 3aa33a94b39fd06f5b406a31f17b4bd1
BLAKE2b-256 d119781e1440c00d97a22e1b4c9e0ce09fed1a8741ba46b6d38cc887fe86c7ed


File details

Details for the file do_to_ssh_config-0.0.4-py3-none-any.whl.

File metadata

  • Download URL: do_to_ssh_config-0.0.4-py3-none-any.whl
  • Upload date:
  • Size: 5.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.2.0 requests-toolbelt/0.8.0 tqdm/4.25.0 CPython/3.6.5

File hashes

Hashes for do_to_ssh_config-0.0.4-py3-none-any.whl
Algorithm Hash digest
SHA256 0bc87b502e455eb28c4c53c3d2c1d3b6ea2d4d8a871679c1a0b2fe9ce2af029f
MD5 e029744952640ef9f46ea0d3cee19a74
BLAKE2b-256 0d0c577a5c1cb31f2a12e0961c7df09087532c4b914334d40de434c6e0d8b9d7

