Python library to expose S3 as vault to store encrypted data


S3Vaultlib is a Python library that implements a vault on an S3 bucket by using IAM Roles, KMS Keys, Bucket Policies and Server Side Encryption.

Features

  • Uses Server Side Encryption to store the objects on S3
  • Integrates IAM Roles, KMS Keys and CloudFormation generation to set up and configure / update the vault
  • Saves, retrieves and updates objects in the vault
  • Implements an abstraction to use the vault objects as Jinja variables in templates (retrieving and processing objects as variables / dicts for Jinja is completely transparent to the user)
  • Implements an Ansible Action/Module that integrates with your Ansible orchestration and lets you retrieve secrets and keymaterials from S3 in a secure way
  • Is backed by a useful command line tool, s3vaultcli, that automates most of the necessary operations, so no line of code is required to start using S3Vaultlib
  • All the IAM-related resources are orchestrated via a very simple YAML-based configuration

Architecture

S3Vaultlib implements the automation to create a Vault by setting up the necessary AWS resources.

The vault is driven by a simple YAML file that describes the access control levels and the roles that can interact with it. A reference YAML is available in s3vaultlib/resources/s3vault.example.yml

The vault uses S3 as a backend, secured by enforcing server side encryption via S3 Bucket Policies. For each actor that interacts with the vault, a new IAM Role and a KMS Key + KMS Alias are created. The role can then access the vault by reading only its allowed paths, and only with the KMS Key that grants it decrypt access. You can also define roles with read/write access to specific paths of the vault, to be used for provisioning keymaterials/secrets with the key attached to the role that requires read access.

The entire configuration is generated by consuming the YAML file and producing the target Cloud Formation template that drives the Vault.

The library and the CLI also implement support for template expansion, making it very easy to expose the entire vault as a variable set for use in Jinja-based templates.
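The expansion idea can be sketched in plain Python. This is a stdlib-only illustration of the concept, not the library's actual Jinja-based implementation; the `render` helper and the regex-based substitution are assumptions made for the sketch:

```python
import re


def render(template: str, variables: dict) -> str:
    """Replace {{ dotted.name }} placeholders with values resolved
    from a nested dict, mimicking how vault objects are exposed as
    template variables."""
    def resolve(match):
        value = variables
        for part in match.group(1).split("."):
            value = value[part]
        return str(value)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", resolve, template)


# Vault objects fetched from S3 become a plain dict of variables:
vault = {"nginx": {"server_name": "www.example.com", "server_port": "8443"}}

print(render("server_name  {{ nginx.server_name }};", vault))
# server_name  www.example.com;
```

The real tool does the S3 fetch and decryption transparently, so the user only ever sees the Jinja variables.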

Example Use Case

We need to deploy an nginx instance and dynamically provision a server name, a port, and the htpasswd file for basic authentication.

First of all, we will have a KMS key (nginx-key) for the role associated with the instance (nginx-role).

Now we provision the secrets in the vault with the cli:

  • nginx configuration:
s3vaultcli configset -b <bucket> -p vault/nginx -c nginx -K server_name -V www.example.com
s3vaultcli configset -b <bucket> -p vault/nginx -c nginx -K server_port -V 8443
  • htpasswd upload:
s3vaultcli push -b <bucket> -p vault/nginx -s htpasswd -d htpasswd

NOTE: the library will try to detect the role and use a KMS key whose alias matches the role name. If we are on another machine (or provisioning from our local machine), we need access to the KMS key and must specify the alias with the -k key_alias option.
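For example, when provisioning from a workstation where role detection is not possible, the alias can be passed explicitly. This is a sketch reusing the flags shown above; `nginx-role` as the alias is an assumption for illustration:

s3vaultcli configset -b <bucket> -p vault/nginx -c nginx -K server_name -V www.example.com -k nginx-role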

In S3 now, we will have a structure like this:

$ aws s3 ls s3://<bucket>/vault/nginx/
2017-08-20 18:02:10          5 htpasswd
2017-08-20 18:00:39         57 nginx

Now, inside the instance, we can prepare the templates to expand. For nginx.conf.j2:

$ cat nginx.conf.j2
server {
    listen       {{ nginx.server_port }};
    server_name  {{ nginx.server_name }};
    access_log   logs/localhost.access.log  main;
    location / {
        root   html;
        index  index.html index.htm;
    }
    include /etc/nginx/sites-enabled/*;
}
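With the values provisioned earlier (server_name www.example.com, server_port 8443), the template expansion would render roughly this nginx.conf:

server {
    listen       8443;
    server_name  www.example.com;
    access_log   logs/localhost.access.log  main;
    location / {
        root   html;
        index  index.html index.htm;
    }
    include /etc/nginx/sites-enabled/*;
}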

And the htpasswd.j2:

$ cat htpasswd.j2
{{ htpasswd }}

When the instance starts, you can use the s3vaultcli tool in the userdata to render the templates, in this way:

s3vaultcli template -b <bucket> -p vault/nginx -t nginx.conf.j2 -d nginx.conf
s3vaultcli template -b <bucket> -p vault/nginx -t htpasswd.j2 -d htpasswd

Provisioning the Vault

The vault can be provisioned by editing a YAML configuration. You can create the YAML configuration file with:

s3vaultcli create_s3vault_config --help

Once you have updated the file with your roles and paths, you can produce the CloudFormation template with:

s3vaultcli create_cloudformation --help

Ansible Module

The library also includes a useful Ansible Module/ActionPlugin that allows you to easily create files from templates. Check the following example:

template.j2:

template test
{{ nginx.server_name }}
port: {{ nginx.port }}
certificate:
{{ cert }}
htpasswd:
{{ htpasswd }}
environment:
{{ ansible_env.PYENV_SHELL }}
environment2:
{{ environment['LOGNAME'] }}

playbook.yml:

---
- name: test my new module
  connection: local
  hosts: localhost
  roles:
    # the role will load the plugins / modules to be used later
    - s3vault
  tasks:
    - name: test
      s3vault_template:
        bucket: 230706054651
        path: vault/nginx/
        kms_alias: gchiesa/testkey
        src: template.j2
        dest: outcome.txt
        ec2: false
        region: eu-west-1

This way, the s3vault_template module takes the template, connects to S3, exposes the vault objects as variables, and renders the file from your template.

License

Credits

This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.

History

0.1.0 (2017-08-19)

  • First release on PyPI.
