
🐙 Mocktopus

Multi-armed mocks for LLM apps

Mocktopus is a drop-in replacement for OpenAI/Anthropic APIs, designed to make your LLM application tests fast, deterministic, and cost-free.


Why Mocktopus?

Testing LLM applications is challenging:

  • Non-deterministic: Same prompt, different responses
  • Expensive: Every test run costs API credits
  • Slow: API calls add latency to test suites
  • Network-dependent: Can't run tests offline
  • Complex workflows: Tool calls and streaming complicate testing

Mocktopus solves these problems with a local mock server that mimics the OpenAI and Anthropic APIs.

Features

✅ Drop-in replacement - Just change your base URL
✅ Deterministic responses - Same input → same output
✅ Tool/function calling - Full support for complex workflows
✅ Streaming - Server-sent events (SSE) support
✅ Multiple providers - OpenAI and Anthropic compatible
✅ Zero cost - No API charges for tests
✅ Fast - No network latency
✅ Offline - Run tests without internet

Installation

pip install mocktopus

Quick Start

1. Create a scenario file (scenario.yaml):

version: 1
rules:
  - type: llm.openai
    when:
      model: "gpt-4*"
      messages_contains: "hello"
    respond:
      content: "Hello! How can I help you today?"

2. Start the mock server:

mocktopus serve -s scenario.yaml

3. Point your app to Mocktopus:

from openai import OpenAI

# Instead of the real API:
# client = OpenAI(api_key="sk-...")

# Use Mocktopus:
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="mock-key"  # Any string works
)

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "hello"}]
)
print(response.choices[0].message.content)
# Output: "Hello! How can I help you today?"
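In a real app you typically switch the base URL through configuration, so the same code talks to Mocktopus in tests and to the real API in production. A minimal sketch of that switch (the `MOCKTOPUS_URL` variable name is illustrative, not something the OpenAI SDK or Mocktopus reads automatically):

```python
import os

def resolve_base_url(env: dict) -> str:
    # Use the mock server when MOCKTOPUS_URL is set (e.g. in CI),
    # otherwise fall back to the real OpenAI endpoint.
    return env.get("MOCKTOPUS_URL", "https://api.openai.com/v1")

# In tests: export MOCKTOPUS_URL=http://localhost:8080/v1
base_url = resolve_base_url(dict(os.environ))
```

Pass the resolved URL as `base_url` when constructing the `OpenAI` client, as shown above.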

Usage Modes

Mock Mode (Default)

Use predefined YAML scenarios for deterministic responses:

mocktopus serve -s examples/chat-basic.yaml

Record Mode (Coming Soon)

Proxy and record real API calls for later replay:

mocktopus serve --mode record --recordings-dir ./recordings

Replay Mode (Coming Soon)

Replay previously recorded API interactions:

mocktopus serve --mode replay --recordings-dir ./recordings

Scenario Examples

Basic Chat Response

version: 1
rules:
  - type: llm.openai
    when:
      messages_contains: "weather"
    respond:
      content: "It's sunny today!"

Function Calling

version: 1
rules:
  - type: llm.openai
    when:
      messages_contains: "weather"
    respond:
      tool_calls:
        - id: "call_123"
          type: "function"
          function:
            name: "get_weather"
            arguments: '{"location": "San Francisco"}'
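On the client side, your app receives this mocked tool call exactly as it would from the real API and dispatches it to a local handler. A sketch of that dispatch step, operating on a dict shaped like the scenario's response (`handle_tool_call` and the handler logic are illustrative, not part of Mocktopus):

```python
import json

def handle_tool_call(tool_call: dict) -> str:
    # Route a tool call (as found in response.choices[0].message.tool_calls)
    # to a local implementation; a real app would register many handlers.
    name = tool_call["function"]["name"]
    if name == "get_weather":
        args = json.loads(tool_call["function"]["arguments"])
        return f"Looked up weather for {args['location']}"
    raise ValueError(f"unknown tool: {name}")

mocked = {
    "id": "call_123",
    "type": "function",
    "function": {"name": "get_weather", "arguments": '{"location": "San Francisco"}'},
}
print(handle_tool_call(mocked))  # Looked up weather for San Francisco
```

Because the scenario pins the `arguments` JSON, the test can assert on the handler's output deterministically.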

Streaming Response

version: 1
rules:
  - type: llm.openai
    when:
      model: "*"
    respond:
      content: "This will be streamed..."
      delay_ms: 50  # Delay between chunks
      chunk_size: 5  # Characters per chunk
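The `chunk_size` setting splits the content into fixed-size pieces, and concatenating the streamed deltas reconstructs the full text. A self-contained illustration of that splitting (this mimics the configured behavior, not Mocktopus's actual internals):

```python
def chunk_text(text: str, chunk_size: int):
    # Yield consecutive chunk_size-character slices, like SSE content deltas.
    for i in range(0, len(text), chunk_size):
        yield text[i:i + chunk_size]

chunks = list(chunk_text("This will be streamed...", 5))
print(chunks[0])        # first 5-character delta
print("".join(chunks))  # full content reassembled
```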

Limited Usage

version: 1
rules:
  - type: llm.openai
    when:
      messages_contains: "test"
    times: 3  # Only responds 3 times
    respond:
      content: "Limited response"

CLI Commands

Start Server

# Basic usage
mocktopus serve -s scenario.yaml

# Custom port
mocktopus serve -s scenario.yaml -p 9000

# Verbose logging
mocktopus serve -s scenario.yaml -v

Test Scenarios

# Validate a scenario file
mocktopus validate scenario.yaml

# Simulate a request without starting server
mocktopus simulate -s scenario.yaml --prompt "Hello"

# Generate example scenarios
mocktopus example --type basic > my-scenario.yaml
mocktopus example --type tools > tools-scenario.yaml

Testing with Mocktopus

Pytest Integration

import pytest
from mocktopus import use_mocktopus

def test_my_llm_app(use_mocktopus):
    # Load scenario
    use_mocktopus.load_yaml("tests/scenarios/test.yaml")

    # Get a client
    client = use_mocktopus.openai_client()

    # Test your app
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": "test"}]
    )
    assert "expected" in response.choices[0].message.content

Continuous Integration

# .github/workflows/test.yml
name: Tests
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
      - run: pip install -e .
      - run: mocktopus serve -s tests/scenarios.yaml &
      - run: pytest  # Your tests hit localhost:8080
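Starting the server with `&` means `pytest` can begin before the port is open. A small readiness helper your test setup could call first (`wait_for_port` is illustrative, stdlib only, assuming the default port 8080):

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 5.0) -> bool:
    # Poll until a TCP connection succeeds or the timeout elapses.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=0.5):
                return True
        except OSError:
            time.sleep(0.1)
    return False
```

Call `wait_for_port("localhost", 8080)` in a session-scoped fixture before the first request.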

Advanced Features

Pattern Matching

Mocktopus supports multiple matching strategies:

  • Substring match: messages_contains: "exact phrase"
  • Regex: messages_regex: "\\d+ items?"
  • Glob: model: "gpt-4*"
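These strategies behave like substring search, Python-style regular expressions, and shell-style globs, respectively. A quick sanity check of those semantics using the standard library (this illustrates the matching rules, not Mocktopus's implementation):

```python
import fnmatch
import re

prompt = "please list 3 items"

assert "3 items" in prompt                       # messages_contains: substring
assert re.search(r"\d+ items?", prompt)          # messages_regex: regex search
assert fnmatch.fnmatch("gpt-4-turbo", "gpt-4*")  # model: glob pattern
print("all matching strategies behave as expected")
```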

Response Configuration

respond:
  content: "Response text"
  delay_ms: 100  # Simulate latency
  usage:
    input_tokens: 10
    output_tokens: 20
  # For streaming
  chunk_size: 10  # Characters per chunk

Roadmap

  • OpenAI chat completions API
  • Streaming support (SSE)
  • Function/tool calling
  • Anthropic messages API
  • Recording & replay
  • Embeddings API
  • Assistants API
  • Image generation
  • Semantic similarity matching
  • Response templating
  • Load testing mode

Contributing

We welcome contributions! See our Contributing Guide for details.

License

MIT - See LICENSE for details.


Made with 🐙 by EvalOps
