Docker Compose testing environment orchestrator


🚀 Uber-Compose — Lightweight Docker Compose Extension for Test Environments

🔧 Overview

Uber-Compose is a lightweight extension for managing test environments with Docker Compose. It simplifies infrastructure management for end-to-end (E2E) and integration testing by automatically provisioning services before tests begin and cleaning them up afterward.

It integrates seamlessly with the Vedro testing framework (https://vedro.io) via a dedicated plugin.

With Uber-Compose, you can define test environments, handle multiple docker-compose configurations, and focus entirely on your test scenarios — the infrastructure is managed for you.


✨ Key Features

  • 🚀 Automated setup and teardown of Docker Compose services
  • 🔌 Native plugin integration with Vedro (https://vedro.io)
  • 🧩 Supports multiple docker-compose profiles
  • 🛠️ Flexible command-line control
  • 💻 Works in both local dev and CI/CD environments

📦 Installation

Install via pip:

pip install uber-compose

Or add to your requirements.txt:

uber-compose

🛠️ How to Use with Vedro

1. Set up the test container and supporting utilities

See the E2E Test Setup guide for configuring a test container for E2E testing.
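For reference, a minimal sketch of what such a test-runner setup might look like in docker-compose.yml, assuming the service names e2e-tests and dockersock used in the run commands later in this guide (the images, mounts, and socket-proxy approach are illustrative assumptions, not the guide's prescribed setup):

```yaml
services:
  # illustrative: a proxy exposing the host Docker daemon to the test container
  dockersock:
    image: alpine/socat
    command: tcp-listen:2375,fork,reuseaddr unix-connect:/var/run/docker.sock
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

  # illustrative: the container that runs the Vedro test suite
  e2e-tests:
    build: ./e2e
    environment:
      DOCKER_HOST: tcp://dockersock:2375
    depends_on:
      - dockersock
```

The test container needs access to a Docker daemon so that Uber-Compose, running inside it, can manage the services under test.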

2. Enable the Plugin in vedro.cfg.py

import vedro
from uber_compose import VedroUberCompose, ComposeConfig, Environment, Service, DEFAULT_COMPOSE

class Config(vedro.Config):
    class Plugins(vedro.Config.Plugins):
        class UberCompose(VedroUberCompose):
            enabled = True

            # Define Docker Compose services
            default_env = Environment(
                # a service name from docker-compose.yml
                Service("db"),
                # or simply
                "api",
            )

            # Define Compose profiles
            compose_cfgs = {
                DEFAULT_COMPOSE: ComposeConfig(
                    compose_files="docker-compose.yml",
                ),
                "dev": ComposeConfig(
                    compose_files="docker-compose.yml:docker-compose.dev.yml",
                ),
            }
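For context, here is a minimal docker-compose.yml that the configuration above could point at. This is an illustrative sketch: the images and healthcheck are assumptions; only the service names db and api are taken from the Environment definition above.

```yaml
services:
  db:
    image: postgres:16        # illustrative image choice
    environment:
      POSTGRES_PASSWORD: test
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U postgres"]
      interval: 2s
      retries: 15

  api:
    build: .                  # illustrative: your application image
    depends_on:
      db:
        condition: service_healthy
```

Healthchecks matter here: an orchestrator that waits for services can only wait meaningfully if the compose file declares what "healthy" means.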

3. Run Your Tests

Uber-Compose will:

  • Automatically start necessary services
  • Ensure they are fully running before tests begin
  • Restart services whose configuration has changed

Everything is handled for you!
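The restart-on-change behavior can be sketched generically: fingerprint each service's effective configuration and restart only the services whose fingerprint differs from the previous run. This is an illustrative sketch of the technique, not Uber-Compose's actual implementation:

```python
import hashlib
import json


def config_fingerprint(service_cfg: dict) -> str:
    """Hash a service's config dict so any change is detectable."""
    canonical = json.dumps(service_cfg, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()


def services_to_restart(previous: dict[str, str], current: dict[str, dict]) -> set[str]:
    """Return service names whose config fingerprint changed since the last run.

    `previous` maps service name -> fingerprint from the prior run;
    `current` maps service name -> its current config dict.
    """
    return {
        name for name, cfg in current.items()
        if previous.get(name) != config_fingerprint(cfg)
    }
```

Services absent from the previous run have no stored fingerprint, so they are always (re)started; unchanged services are left running, which keeps repeated test runs fast.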

Start the test environment:

# Start test container and Docker daemon
docker-compose up -d e2e-tests dockersock

# Run tests
docker-compose exec e2e-tests vedro run scenarios/

Or wrap the commands in a Makefile:

make up                    # Start containers
make e2e-run args='scenarios/'   # Run tests

Command Line Options

You can customize behavior dynamically:

  • --uc-fr — Force restart of services
  • --uc-v — Set logging verbosity level
  • --uc-default / --uc-dev — Choose defined ComposeConfigs
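To illustrate how these flags compose, here is a rough argparse-based sketch of a parser with the same options. This is not the plugin's actual parser; the exact flag semantics (verbosity as an integer, profile switches being mutually exclusive) are assumptions based on the descriptions above:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(description="Illustrative uber-compose flag parser")
    parser.add_argument("--uc-fr", action="store_true",
                        help="force restart of services")
    parser.add_argument("--uc-v", type=int, default=0,
                        help="logging verbosity level")
    # one switch per defined ComposeConfig; only one profile can be active
    group = parser.add_mutually_exclusive_group()
    group.add_argument("--uc-default", dest="profile", action="store_const",
                       const="default", help="use the default ComposeConfig")
    group.add_argument("--uc-dev", dest="profile", action="store_const",
                       const="dev", help="use the dev ComposeConfig")
    parser.set_defaults(profile="default")
    return parser
```

For example, `vedro run scenarios/ --uc-fr --uc-dev` would force-restart services under the dev profile.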

🎯 Environment-Specific Test Configurations

You can define custom environments for specific test scenarios and Uber-Compose will automatically provision the required services when running those tests.

Define Custom Environments

Create environment configurations that match your test requirements:

# envs.py
from uber_compose import Environment, Service

WEB_S3_MOCKMQ = Environment(
    Service("s3"),
    Service("mock_mq"),
    Service("cli"),
    Service("api")
)

MINIMAL_DB_ONLY = Environment(
    Service("database")
)

Use in Your Tests

Simply specify the environment in your test scenario:

# test.py
import vedro
from envs import WEB_S3_MOCKMQ

class Scenario(vedro.Scenario):
    subject = 'consume contest mq message without message'
    env = WEB_S3_MOCKMQ

    def when_message_consumed(self):
        # Your test logic here
        pass

Automatic Environment Management

Run your test file and the required environment will be set up automatically:

vedro run test_path.py

Uber-Compose will:

  • ✅ Detect the custom environment specified in your test
  • 🚀 Start only the required services (s3, mock_mq, cli, api)
  • ⏱️ Wait for all services to be healthy before running the test
  • 🧹 Clean up resources after test completion

This approach ensures each test gets exactly the infrastructure it needs, improving test isolation and reducing resource usage.
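The per-test selection above can be modeled simply: each scenario carries an Environment, and the orchestrator resolves it to the set of service names to bring up. A minimal sketch with stand-in Service/Environment classes (not the library's real API; only the accepted argument shapes mirror the configuration examples above):

```python
class Service:
    """Stand-in for uber_compose.Service: just a named service."""
    def __init__(self, name: str):
        self.name = name


class Environment:
    """Stand-in for uber_compose.Environment: a collection of services."""
    def __init__(self, *services):
        # accept Service objects or bare strings, as in the plugin config example
        self.services = [s if isinstance(s, Service) else Service(s) for s in services]


def required_service_names(env: Environment) -> set[str]:
    """Resolve the set of service names a scenario's environment needs."""
    return {s.name for s in env.services}
```

Resolving to a set is what enables the resource savings: only the named services are started, and two scenarios declaring the same environment can share one running stack.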


📚 Library Usage

🛠️ Development Guide


✔️ Ideal For

  • ✅ End-to-End (E2E) testing
  • 🔗 Integration testing
  • 🧪 Local development & reproducible CI pipelines
  • 🎯 Structured tests with Vedro (https://vedro.io)

🤝 Contribute

We welcome pull requests, feature requests, and community feedback!

📍 Source Repository:
https://github.com/ko10ok/uber-compose


🧰 One Command. Fully Managed Environments.
