
aiSSEMBLE OIP Common Test Utilities

Reusable MLServer test utilities for aiSSEMBLE Inference modules and examples.

Overview

This module provides consolidated MLServer lifecycle management utilities to eliminate code duplication across test suites. It offers two primary usage patterns:

  1. Simple mode: For examples with static model directories
  2. Dynamic mode: For module tests with temporary config generation

Installation

# As a test dependency in pyproject.toml
[dependency-groups]
test = [
    "aissemble-inference-common-test",
]

[tool.uv.sources]
aissemble-inference-common-test = { path = "../aissemble-inference-common-test", editable = true }

Usage

Simple Mode (Examples)

For tests that use pre-configured model directories:

# tests/features/environment.py
from pathlib import Path
from aissemble_inference_common_test.behave_helpers import (
    setup_mlserver_simple,
    teardown_mlserver,
)

def before_all(context):
    example_dir = Path(__file__).parent.parent.parent
    models_dir = example_dir / "models"

    setup_mlserver_simple(context, models_dir=models_dir, port=8080)
    context.mlserver_fixture.start()

def after_all(context):
    teardown_mlserver(context)

Dynamic Mode (Module Tests)

For tests that generate model configurations dynamically:

# tests/features/environment.py
from aissemble_inference_common_test.behave_helpers import (
    setup_mlserver_dynamic,
    teardown_mlserver,
    start_mlserver_with_model,
)

def before_all(context):
    setup_mlserver_dynamic(context)

def after_scenario(context, scenario):
    if hasattr(context, "mlserver_fixture") and context.mlserver_fixture.process:
        context.mlserver_fixture.stop()

def after_all(context):
    teardown_mlserver(context)

# In your step definitions:
start_mlserver_with_model(
    context,
    model_name="yolo",
    runtime="aissemble_inference_yolo.YOLORuntime",
    model="yolov8n.pt"
)
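Under the hood, start_mlserver_with_model presumably generates an MLServer model-settings.json for the requested model before launching the server. The sketch below illustrates what that generated file likely looks like; the helper name and the placement of extra parameters under parameters.extra are assumptions based on MLServer's documented configuration format, not this package's source:

```python
import json
from pathlib import Path


def write_model_settings(models_dir: Path, model_name: str, runtime: str, **params) -> Path:
    """Sketch: generate an MLServer model-settings.json for a single model."""
    model_dir = models_dir / model_name
    model_dir.mkdir(parents=True, exist_ok=True)
    settings = {
        "name": model_name,
        # Dotted path to the runtime class, e.g. "aissemble_inference_yolo.YOLORuntime"
        "implementation": runtime,
        # Runtime-specific options (such as model="yolov8n.pt") travel under parameters.extra
        "parameters": {"extra": params},
    }
    path = model_dir / "model-settings.json"
    path.write_text(json.dumps(settings, indent=2))
    return path
```

MLServer discovers this file by scanning the models directory, so each model gets its own subdirectory with its own settings file.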

Features

  • Process Management: Robust MLServer subprocess handling with zombie process detection
  • Health Checking: Automatic polling of /v2/health/ready endpoint
  • Graceful Shutdown: Terminates with a timeout, escalating to kill if the process does not exit in time
  • Dynamic Port Allocation: Automatic free port discovery
  • Config Generation: JSON settings and model-settings creation
  • Context Manager Support: Automatic cleanup via Python's with statement
  • Error Logging: Detailed warnings for cleanup and shutdown failures
  • Backward Compatible: Drop-in replacement for existing test code
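The port-allocation and health-checking features above follow well-known patterns; a minimal sketch of both (function names are illustrative, not the package's actual API):

```python
import socket
import time
from typing import Callable


def find_free_port() -> int:
    """Ask the OS for a free ephemeral port by binding to port 0."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("127.0.0.1", 0))
        return sock.getsockname()[1]


def wait_until_ready(check: Callable[[], bool], timeout: float = 30.0, interval: float = 0.25) -> bool:
    """Poll `check` (e.g. a GET against /v2/health/ready) until it succeeds or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if check():
            return True
        time.sleep(interval)
    return False
```

Binding to port 0 delegates port selection to the kernel, which avoids races between concurrent test suites picking the same hard-coded port.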

API Reference

MLServerFixture

Main fixture class for MLServer lifecycle management.

Basic Usage

from aissemble_inference_common_test import MLServerFixture

# Simple mode
fixture = MLServerFixture.simple(port=8080, models_dir=Path("models"))
fixture.start()

# Dynamic mode
fixture = MLServerFixture.dynamic()
fixture.start_with_model(
    model_name="yolo",
    runtime="aissemble_inference_yolo.YOLORuntime",
    model="yolov8n.pt"
)

# Cleanup
fixture.stop()
fixture.cleanup()
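The graceful-shutdown behavior that stop() provides can be sketched with the standard subprocess module (this is an illustration of the terminate-then-kill pattern, not the fixture's actual implementation):

```python
import subprocess


def stop_gracefully(process: subprocess.Popen, term_timeout: float = 10.0, kill_timeout: float = 5.0) -> None:
    """Terminate the server process, escalating to kill if it ignores SIGTERM."""
    if process.poll() is not None:
        return  # already exited; nothing to do
    process.terminate()
    try:
        process.wait(timeout=term_timeout)
    except subprocess.TimeoutExpired:
        # The server did not shut down in time; force-kill and reap it
        process.kill()
        process.wait(timeout=kill_timeout)
```

The second wait() matters: without it, a killed server lingers as a zombie until the test process exits, which is exactly the condition the fixture's zombie-process detection guards against.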

Context Manager Usage (Recommended)

For automatic cleanup even when tests fail:

from aissemble_inference_common_test import MLServerFixture
from pathlib import Path

# Simple mode with context manager
with MLServerFixture.simple(port=8080, models_dir=Path("models")) as fixture:
    fixture.start()
    # Run tests...
    # Automatic cleanup on exit

# Dynamic mode with context manager
with MLServerFixture.dynamic() as fixture:
    fixture.start_with_model(
        model_name="yolo",
        runtime="aissemble_inference_yolo.YOLORuntime",
        model="yolov8n.pt"
    )
    # Run tests...
    # Automatic stop() and cleanup() on exit, even if exception occurs
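The context-manager behavior shown above maps onto Python's standard __enter__/__exit__ protocol. A minimal sketch of the shape MLServerFixture likely follows (class and attribute names here are illustrative):

```python
class ManagedFixture:
    """Sketch of a fixture that guarantees cleanup via the context-manager protocol."""

    def __init__(self):
        self.stopped = False
        self.cleaned = False

    def stop(self):
        self.stopped = True  # placeholder for stopping the server subprocess

    def cleanup(self):
        self.cleaned = True  # placeholder for removing temp config directories

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        # Always stop the server and clean up, even when the test body raised
        self.stop()
        self.cleanup()
        return False  # never suppress the test's exception
```

Returning False from __exit__ is what lets a failing test's exception propagate to the runner while still guaranteeing the server is shut down.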

Behave Helpers

  • setup_mlserver_simple(context, models_dir, port) - Initialize simple fixture
  • setup_mlserver_dynamic(context) - Initialize dynamic fixture
  • start_mlserver_with_model(context, model_name, runtime, **params) - Start with config
  • teardown_mlserver(context) - Cleanup and shutdown
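A helper like teardown_mlserver is typically written to be safe even when setup never ran (for example, when before_all failed). A hedged sketch of that defensive shape, using an illustrative function name rather than the package's actual implementation:

```python
def teardown_mlserver_sketch(context) -> None:
    """Illustrative teardown: a no-op if the fixture was never attached to the context."""
    fixture = getattr(context, "mlserver_fixture", None)
    if fixture is None:
        return
    fixture.stop()
    fixture.cleanup()
    del context.mlserver_fixture  # make repeated teardown calls idempotent
```

Guarding with getattr mirrors the hasattr check shown in the after_scenario hook above.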

License

Apache 2.0
