
Docker Compose testing environment orchestrator


🚀 Uber-Compose — Lightweight Docker Compose Extension for Test Environments

🔧 Overview

Uber-Compose is a lightweight extension for managing test environments with Docker Compose. It simplifies infrastructure management for end-to-end (E2E) and integration testing by automatically provisioning services before tests begin and cleaning them up afterward.

It integrates seamlessly with the Vedro testing framework (https://vedro.io) via a dedicated plugin.

With Uber-Compose, you can define test environments, handle multiple docker-compose configurations, and focus entirely on your test scenarios — the infrastructure is managed for you.


✨ Key Features

  • 🚀 Automated setup and teardown of Docker Compose services
  • 🔌 Native plugin integration with Vedro (https://vedro.io)
  • 🧩 Supports multiple docker-compose profiles
  • 🛠️ Flexible command-line control
  • 💻 Works in both local dev and CI/CD environments

📦 Installation

Install via pip:

pip install uber-compose

Or add to your requirements.txt:

uber-compose

🛠️ How to Use with Vedro

1. Set up the test container parameters and utilities

See the E2E Test Setup guide for configuring a test container for E2E testing.

2. Enable the Plugin in vedro.cfg.py

import vedro
from uber_compose import VedroUberCompose, ComposeConfig, Environment, Service, DEFAULT_COMPOSE

class Config(vedro.Config):
    class Plugins(vedro.Config.Plugins):
        class UberCompose(VedroUberCompose):
            enabled = True

            # Define Docker Compose services
            default_env = Environment(
                # a service named in docker-compose.yml
                Service("db"),
                # or just the service name as a string
                "api",
            )

            # Define Compose profiles
            compose_cfgs = {
                DEFAULT_COMPOSE: ComposeConfig(
                    compose_files="docker-compose.yml",
                ),
                "dev": ComposeConfig(
                    compose_files="docker-compose.yml:docker-compose.dev.yml",
                ),
            }

3. Run Your Tests

Uber-Compose will:

  • Automatically start necessary services
  • Ensure they are fully running before tests begin
  • Restart services whose configuration has changed

Everything is handled for you!

Start the test environment:

# Start test container and Docker daemon
docker-compose up -d e2e-tests dockersock

# Run tests
docker-compose exec e2e-tests vedro run scenarios/

Or wrap the commands with Make:

make up                    # Start containers
make e2e-run args='scenarios/'   # Run tests

Command Line Options

You can customize behavior dynamically:

  • --uc-fr — Force restart of services
  • --uc-v — Set logging verbosity level
  • --uc-default / --uc-dev — Select one of the defined ComposeConfig profiles
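As an illustration of how the flags above compose, here is an argparse mirror of them (a sketch only; the plugin's real option handling lives inside Vedro's plugin system, and only the flag names come from the list above):

```python
import argparse

# Hypothetical stand-in for the plugin's command-line options
parser = argparse.ArgumentParser()
parser.add_argument("--uc-fr", action="store_true", help="force restart of services")
parser.add_argument("--uc-v", type=int, default=0, help="logging verbosity level")
parser.add_argument("--uc-default", action="store_true", help="use the default ComposeConfig")
parser.add_argument("--uc-dev", action="store_true", help="use the 'dev' ComposeConfig")

args = parser.parse_args(["--uc-fr", "--uc-v", "2"])
assert args.uc_fr and args.uc_v == 2 and not args.uc_dev
```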

🎯 Environment-Specific Test Configurations

You can define custom environments for specific test scenarios, and Uber-Compose will automatically provision the required services when running those tests.

Define Custom Environments

Create environment configurations that match your test requirements:

# envs.py
from uber_compose import Environment, Service

WEB_S3_MOCKMQ = Environment(
    Service("s3"),
    Service("mock_mq"),
    Service("cli"),
    Service("api")
)

MINIMAL_DB_ONLY = Environment(
    Service("database")
)

Use in Your Tests

Simply specify the environment in your test scenario:

# test.py
import vedro
from envs import WEB_S3_MOCKMQ

class Scenario(vedro.Scenario):
    subject = 'consume contest mq message without message'
    env = WEB_S3_MOCKMQ

    def when_message_consumed(self):
        # Your test logic here
        pass

Automatic Environment Management

Run your test file and the required environment will be set up automatically:

vedro run test_path.py

Uber-Compose will:

  • ✅ Detect the custom environment specified in your test
  • 🚀 Start only the required services (s3, mock_mq, cli, api)
  • ⏱️ Wait for all services to be healthy before running the test
  • 🧹 Clean up resources after test completion

This approach ensures each test gets exactly the infrastructure it needs, improving test isolation and reducing resource usage.
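The "wait for all services to be healthy" step can be sketched as a polling loop with a deadline (an illustration under assumed names; `wait_until_healthy` and the `check` callable are hypothetical, not the library's API):

```python
import time

def wait_until_healthy(check, services, timeout=60.0, interval=0.5):
    """Poll a per-service health-check callable until every service is healthy,
    or raise TimeoutError when the deadline passes."""
    deadline = time.monotonic() + timeout
    pending = set(services)
    while True:
        pending = {s for s in pending if not check(s)}
        if not pending:
            return True
        if time.monotonic() >= deadline:
            raise TimeoutError(f"services never became healthy: {sorted(pending)}")
        time.sleep(interval)

healthy = {"s3", "mock_mq", "cli", "api"}
assert wait_until_healthy(lambda s: s in healthy, ["s3", "api"], timeout=1.0)
```

In practice the health check would come from Docker's healthcheck status rather than an in-memory set.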


📚 Library Usage

🛠️ Development Guide


✔️ Ideal For

  • ✅ End-to-End (E2E) testing
  • 🔗 Integration testing
  • 🧪 Local development & reproducible CI pipelines
  • 🎯 Structured tests with Vedro (https://vedro.io)

🤝 Contribute

We welcome pull requests, feature requests, and community feedback!

📍 Source Repository:
https://github.com/ko10ok/uber-compose


🧰 One Command. Fully Managed Environments.
