
crawlerverse

Python SDK for the Crawler Agent API. Build AI agents that play the Crawler roguelike game.

Installation

pip install crawlerverse

Quick Start

from crawlerverse import CrawlerClient, run_game, Attack, Wait, Direction, Observation, Action

# Map (dx, dy) offsets to Direction values
OFFSET_TO_DIR = {
    (0, -1): Direction.NORTH, (0, 1): Direction.SOUTH,
    (1, 0): Direction.EAST, (-1, 0): Direction.WEST,
    (1, -1): Direction.NORTHEAST, (-1, -1): Direction.NORTHWEST,
    (1, 1): Direction.SOUTHEAST, (-1, 1): Direction.SOUTHWEST,
}

def my_agent(observation: Observation) -> Action:
    # Attack the nearest monster if it is adjacent
    monster = observation.nearest_monster()
    if monster:
        tile, _ = monster
        dx = tile.x - observation.player.position[0]
        dy = tile.y - observation.player.position[1]
        direction = OFFSET_TO_DIR.get((dx, dy))
        if direction is not None:
            return Attack(direction=direction)

    # Otherwise just wait
    return Wait()

with CrawlerClient(api_key="cra_...") as client:
    result = run_game(client, my_agent, model_id="my-bot-v1")
    print(f"Game over! Floor {result.outcome.floor}, result: {result.outcome.status}")
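A step up from waiting is pursuit: step toward the nearest monster and attack once adjacent. The sketch below reuses only names shown in this README (nearest_monster, player.position, can_move, Move, Attack); step_toward is an illustrative pure helper, and the SDK wiring sits under a main guard:

```python
# Sketch: pursue the nearest monster, attacking when adjacent.

def step_toward(px, py, tx, ty):
    """Return the unit (dx, dy) step from (px, py) toward (tx, ty)."""
    def sign(n):
        return (n > 0) - (n < 0)
    return (sign(tx - px), sign(ty - py))

if __name__ == "__main__":
    from crawlerverse import (
        CrawlerClient, run_game, Attack, Move, Wait,
        Direction, Observation, Action,
    )

    OFFSET_TO_DIR = {
        (0, -1): Direction.NORTH, (0, 1): Direction.SOUTH,
        (1, 0): Direction.EAST, (-1, 0): Direction.WEST,
        (1, -1): Direction.NORTHEAST, (-1, -1): Direction.NORTHWEST,
        (1, 1): Direction.SOUTHEAST, (-1, 1): Direction.SOUTHWEST,
    }

    def hunter(observation: Observation) -> Action:
        monster = observation.nearest_monster()
        if monster:
            tile, _ = monster
            px, py = observation.player.position
            offset = (tile.x - px, tile.y - py)
            if offset in OFFSET_TO_DIR:  # adjacent: attack
                return Attack(direction=OFFSET_TO_DIR[offset])
            # Not adjacent: take one step toward it if walkable.
            direction = OFFSET_TO_DIR.get(step_toward(px, py, tile.x, tile.y))
            if direction is not None and observation.can_move(direction):
                return Move(direction=direction)
        return Wait()

    with CrawlerClient() as client:
        result = run_game(client, hunter, model_id="hunter-v1")
        print(result.outcome.status)
```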

Authentication

Set your API key via parameter or environment variable:

# Option 1: Pass directly
client = CrawlerClient(api_key="cra_...")

# Option 2: Environment variable
# export CRAWLERVERSE_API_KEY=cra_...
client = CrawlerClient()

Async Support

from crawlerverse import AsyncCrawlerClient, async_run_game

async with AsyncCrawlerClient() as client:
    result = await async_run_game(client, my_agent)

API Reference

Client Methods

client.games.create(model_id="gpt-4o")        # Start a new game
client.games.list(status="completed")         # List your games
client.games.get(game_id)                     # Get game state
client.games.action(game_id, Move(...))       # Submit an action
client.games.abandon(game_id)                 # Abandon a game
client.health()                               # Health check
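run_game drives the request loop for you, but the low-level methods can be wired together directly. A sketch, assuming field names like game.id and game.status on the returned objects and a set of terminal status values (check the actual response models):

```python
# Sketch of a manual game loop built on the low-level client methods.

TERMINAL_STATUSES = {"completed", "dead", "abandoned"}  # assumed values

def is_terminal(status: str) -> bool:
    """True once the game should stop being driven."""
    return status in TERMINAL_STATUSES

if __name__ == "__main__":
    from crawlerverse import CrawlerClient, Wait

    with CrawlerClient() as client:
        game = client.games.create(model_id="manual-loop-v1")
        while not is_terminal(game.status):
            # Plug a real policy in here instead of always waiting.
            game = client.games.action(game.id, Wait())
        print(game.status)
```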

Actions

Move(direction=Direction.NORTH)
Attack(direction=Direction.EAST)
Wait()
Pickup()
Drop(item_type="health-potion")
Use(item_type="health-potion")
Equip(item_type="iron-sword")
EnterPortal()
RangedAttack(direction=Direction.SOUTH, distance=5)
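Actions compose naturally with inventory checks. The sketch below separates the decision rule into a pure function; the hp and max_hp fields on the player are assumptions (only has_item, Use, and Attack appear in this README):

```python
# Sketch: decide between healing and fighting, as a pure rule.

def choose_action_name(hp: int, max_hp: int, has_potion: bool) -> str:
    """Heal below 30% health if a potion is available, otherwise fight."""
    if has_potion and hp * 10 < max_hp * 3:
        return "use-potion"
    return "attack"

if __name__ == "__main__":
    from crawlerverse import Use, Attack, Direction

    def agent(obs):
        # obs.player.hp / max_hp are assumed field names.
        choice = choose_action_name(
            obs.player.hp, obs.player.max_hp, obs.has_item("health-potion")
        )
        if choice == "use-potion":
            return Use(item_type="health-potion")
        return Attack(direction=Direction.NORTH)
```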

Observation Helpers

obs.tile_at(x, y)              # Look up tile by coordinates
obs.monsters()                 # All visible monsters
obs.nearest_monster()          # Closest monster
obs.items_at_feet()            # Items at player's position
obs.has_item("sword")          # Check inventory
obs.can_move(Direction.NORTH)  # Check if direction is walkable
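These helpers make simple movement policies easy to express. A minimal sketch that picks the first walkable direction from a preference order, with can_move abstracted as a predicate so the logic stays testable without the SDK:

```python
# Sketch: choose the first walkable direction from a preference list.

def first_walkable(preferences, can_move):
    """Return the first direction for which can_move(direction) is True,
    or None if nothing in the list is walkable."""
    for direction in preferences:
        if can_move(direction):
            return direction
    return None
```

Inside an agent this would be called as `first_walkable([Direction.NORTH, Direction.EAST], obs.can_move)` and the result fed to Move.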

Logging

The SDK uses Python's standard logging module under the crawlerverse logger. To see game progress and debug info:

import logging
logging.basicConfig(level=logging.INFO)

For more detail (e.g., retry attempts, action payloads):

logging.getLogger("crawlerverse").setLevel(logging.DEBUG)
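Beyond raising the level, the crawlerverse logger can be given its own handler and format so game logs stay separate from the rest of an application's output. A stdlib-only sketch:

```python
# Give the crawlerverse logger a dedicated handler and format.
import logging

logger = logging.getLogger("crawlerverse")
logger.setLevel(logging.DEBUG)

handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
logger.addHandler(handler)
logger.propagate = False  # don't duplicate records into the root logger
```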

Examples

All examples default to a local API at http://localhost:3000/api/agent. Set CRAWLERVERSE_BASE_URL to point at production.

OpenAI

See examples/openai_agent.py:

pip install openai
export CRAWLERVERSE_API_KEY=cra_...
export OPENAI_API_KEY=sk-...
python examples/openai_agent.py

Works with any OpenAI-compatible provider (Ollama, LMStudio, Azure, etc.) via OPENAI_BASE_URL.
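An LLM-driven agent also needs glue to map the model's free-text reply back onto SDK actions. The "ACTION [DIRECTION]" reply format below is a hypothetical convention for illustration, not something the SDK or OpenAI defines; the action and direction names mirror this README:

```python
# Sketch: parse an LLM reply like "MOVE NORTH" or "WAIT" into a
# normalized (action, direction) pair, defaulting to waiting on
# anything malformed.

VALID_ACTIONS = {"MOVE", "ATTACK", "WAIT", "PICKUP"}
VALID_DIRECTIONS = {
    "NORTH", "SOUTH", "EAST", "WEST",
    "NORTHEAST", "NORTHWEST", "SOUTHEAST", "SOUTHWEST",
}

def parse_reply(reply: str):
    """Parse 'ACTION [DIRECTION]' into a tuple, falling back to WAIT."""
    parts = reply.strip().upper().split()
    if not parts or parts[0] not in VALID_ACTIONS:
        return ("WAIT", None)
    action = parts[0]
    direction = parts[1] if len(parts) > 1 and parts[1] in VALID_DIRECTIONS else None
    if action in {"MOVE", "ATTACK"} and direction is None:
        return ("WAIT", None)  # refuse a directional action without a direction
    return (action, direction)
```

The resulting pair maps straightforwardly onto Move(direction=...), Attack(direction=...), Wait(), and Pickup().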

Anthropic (Claude)

See examples/anthropic_agent.py:

pip install anthropic
export CRAWLERVERSE_API_KEY=cra_...
export ANTHROPIC_API_KEY=sk-ant-...
python examples/anthropic_agent.py

Uses Claude Haiku 4.5 by default. Override with ANTHROPIC_MODEL=claude-sonnet-4-5.

Local LLM (Ollama / LMStudio)

See examples/local_llm_agent.py for a script with configurable turn limits and error recovery:

pip install openai
export CRAWLERVERSE_API_KEY=cra_...
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=llama3
python examples/local_llm_agent.py

Supports MAX_TURNS (default 25) and MODEL_ID env vars.
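The environment-variable handling such a script needs can be sketched as plain os.environ reads with the documented fallbacks (the MODEL_ID fallback value here is an assumption):

```python
# Read the local-LLM example's configuration from the environment,
# falling back to the defaults stated in this README.
import os

def load_config(env=os.environ):
    return {
        "max_turns": int(env.get("MAX_TURNS", "25")),
        "model_id": env.get("MODEL_ID", "local-llm"),  # fallback name assumed
        "base_url": env.get("OPENAI_BASE_URL", "http://localhost:11434/v1"),
        "model": env.get("OPENAI_MODEL", "llama3"),
    }
```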

License

MIT

Download files

Source Distribution

crawlerverse-0.2.2.tar.gz (64.1 kB)

  • SHA256: 460c68df8fa29561aa916d01185f0e7b533800aecedacf9b5224342057de3a8b
  • MD5: df7852b04711de86a36943c3d93825a3
  • BLAKE2b-256: 2998a4e1fc999a4de7d6ed14f5b8c80c5edda4fab985df8318d142db97a507b9

Built Distribution

crawlerverse-0.2.2-py3-none-any.whl (13.2 kB, Python 3)

  • SHA256: bcf983cd79ae9194a58cd190161f3d2bd3f18eb0197d97436feffc7fb2a3ab55
  • MD5: 5ac2769cddbfa1c2a6b0ef15547a97f0
  • BLAKE2b-256: cc0517192b758e4df53027eacc6c7a017f10b4801d43177564f99995035708d5

Both files were uploaded via uv 0.10.4 from an Ubuntu 24.04 CI environment, without Trusted Publishing.
