
crawlerverse

Python SDK for the Crawler Agent API. Build AI agents that play the Crawler roguelike game.

Installation

pip install crawlerverse

Quick Start

from crawlerverse import CrawlerClient, run_game, Attack, Wait, Direction, Observation, Action

# Map (dx, dy) offsets to Direction values
OFFSET_TO_DIR = {
    (0, -1): Direction.NORTH, (0, 1): Direction.SOUTH,
    (1, 0): Direction.EAST, (-1, 0): Direction.WEST,
    (1, -1): Direction.NORTHEAST, (-1, -1): Direction.NORTHWEST,
    (1, 1): Direction.SOUTHEAST, (-1, 1): Direction.SOUTHWEST,
}

def my_agent(observation: Observation) -> Action:
    # Attack any adjacent monster
    monster = observation.nearest_monster()
    if monster:
        tile, _ = monster
        dx = tile.x - observation.player.position[0]
        dy = tile.y - observation.player.position[1]
        direction = OFFSET_TO_DIR.get((dx, dy))
        if direction is not None:
            return Attack(direction=direction)

    # Otherwise just wait
    return Wait()

with CrawlerClient(api_key="cra_...") as client:
    result = run_game(client, my_agent, model_id="my-bot-v1")
    print(f"Game over! Floor {result.outcome.floor}, result: {result.outcome.status}")
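To chase monsters that are not yet adjacent, you can pair OFFSET_TO_DIR with a small step helper. This is a hypothetical utility, not part of the SDK:

```python
def step_toward(px: int, py: int, tx: int, ty: int) -> tuple[int, int]:
    """Return the (dx, dy) unit offset that moves one tile toward the target.

    Each axis is clamped to -1/0/1, so diagonal steps fall out naturally
    (Chebyshev movement, matching the eight Direction values above).
    """
    def sign(v: int) -> int:
        return (v > 0) - (v < 0)

    return (sign(tx - px), sign(ty - py))
```

The returned offset can be looked up in OFFSET_TO_DIR and passed to Move(direction=...) when the destination tile is walkable.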

Authentication

Set your API key via parameter or environment variable:

# Option 1: Pass directly
client = CrawlerClient(api_key="cra_...")

# Option 2: Environment variable
# export CRAWLERVERSE_API_KEY=cra_...
client = CrawlerClient()

Async Support

from crawlerverse import AsyncCrawlerClient, async_run_game

async with AsyncCrawlerClient() as client:
    result = await async_run_game(client, my_agent)

API Reference

Client Methods

client.games.create(model_id="gpt-4o")    # Start a new game
client.games.list(status="completed")     # List your games
client.games.get(game_id)                 # Get game state
client.games.action(game_id, Move(...))   # Submit action
client.games.abandon(game_id)             # Abandon game
client.health()                           # Health check
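run_game wraps a loop over these methods. The sketch below is a hedged reconstruction of that control flow, shown against a minimal in-memory stub rather than the real client; the attribute names (game.id, state.over, state.observation) are assumptions for illustration, not the SDK's actual return shapes:

```python
class _StubGames:
    """In-memory stand-in for client.games, for illustration only."""

    def __init__(self, turns_until_over: int):
        self._remaining = turns_until_over
        self.actions = []

    def create(self, model_id):
        return type("Game", (), {"id": "game-1"})()

    def get(self, game_id):
        # Assumed shape: a state object with an `over` flag and an observation.
        return type("State", (), {"over": self._remaining <= 0,
                                  "observation": {}})()

    def action(self, game_id, action):
        self.actions.append(action)
        self._remaining -= 1


def play(games, agent, model_id="demo", max_turns=100):
    """Drive one game turn-by-turn: create, observe, act, until it ends."""
    game = games.create(model_id=model_id)
    for _ in range(max_turns):
        state = games.get(game.id)
        if state.over:
            break
        games.action(game.id, agent(state.observation))
    return games.get(game.id)


games = _StubGames(turns_until_over=3)
final = play(games, agent=lambda obs: "Wait")
print(len(games.actions))  # three actions submitted before the stub ends
```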

Actions

Move(direction=Direction.NORTH)
Attack(direction=Direction.EAST)
Wait()
Pickup()
Drop(item_type="health-potion")
Use(item_type="health-potion")
Equip(item_type="iron-sword")
EnterPortal()
RangedAttack(direction=Direction.SOUTH, distance=5)
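A typical agent ranks these actions by priority. The sketch below is pure decision logic: the inputs (HP, potion in inventory, adjacent monster, item underfoot) are hypothetical summaries you would derive from the observation, and it returns action names rather than constructing the SDK classes, so the idea stands on its own:

```python
def pick_action(hp: int, max_hp: int, has_potion: bool,
                monster_adjacent: bool, item_at_feet: bool) -> str:
    """Rank actions: survive first, then fight, then loot, then wait."""
    if has_potion and hp <= max_hp // 3:
        return "Use"      # drink a potion when at a third of max HP or less
    if monster_adjacent:
        return "Attack"
    if item_at_feet:
        return "Pickup"
    return "Wait"
```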

Observation Helpers

obs.tile_at(x, y)          # Look up tile by coordinates
obs.monsters()              # All visible monsters
obs.nearest_monster()       # Closest monster
obs.items_at_feet()         # Items at player's position
obs.has_item("sword")       # Check inventory
obs.can_move(Direction.NORTH)  # Check if direction is walkable
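can_move composes naturally with a preference order when full path-finding is overkill. The helper below is a hypothetical sketch: it takes a preference list and a can_move-style predicate and returns the first walkable option, or None if all are blocked:

```python
from typing import Callable, Optional, Sequence

def first_walkable(preferred: Sequence[str],
                   can_move: Callable[[str], bool]) -> Optional[str]:
    """Return the first direction in `preferred` that the predicate accepts."""
    for direction in preferred:
        if can_move(direction):
            return direction
    return None
```

In practice you would pass Direction values and obs.can_move, e.g. first_walkable([Direction.NORTH, Direction.NORTHEAST, Direction.NORTHWEST], obs.can_move).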

Logging

The SDK uses Python's standard logging module under the crawlerverse logger. To see game progress and debug info:

import logging
logging.basicConfig(level=logging.INFO)

For more detail (e.g., retry attempts, action payloads):

logging.getLogger("crawlerverse").setLevel(logging.DEBUG)
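If you want verbose SDK output without enabling DEBUG everywhere, attach a handler directly to the crawlerverse logger. This is standard-library logging, nothing SDK-specific:

```python
import logging

sdk_logger = logging.getLogger("crawlerverse")
sdk_logger.setLevel(logging.DEBUG)

handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s"))
sdk_logger.addHandler(handler)
sdk_logger.propagate = False  # keep SDK records out of the root logger
```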

Examples

All examples default to a local API at http://localhost:3000/api/agent. Set CRAWLERVERSE_BASE_URL to point at production.

OpenAI

See examples/openai_agent.py:

pip install openai
export CRAWLERVERSE_API_KEY=cra_...
export OPENAI_API_KEY=sk-...
python examples/openai_agent.py

Works with any OpenAI-compatible provider (Ollama, LM Studio, Azure, etc.) via OPENAI_BASE_URL.

Anthropic (Claude)

See examples/anthropic_agent.py:

pip install anthropic
export CRAWLERVERSE_API_KEY=cra_...
export ANTHROPIC_API_KEY=sk-ant-...
python examples/anthropic_agent.py

Uses Claude Haiku 4.5 by default. Override with ANTHROPIC_MODEL=claude-sonnet-4-5.

Local LLM (Ollama / LM Studio)

See examples/local_llm_agent.py for a script with configurable turn limits and error recovery:

pip install openai
export CRAWLERVERSE_API_KEY=cra_...
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=llama3
python examples/local_llm_agent.py

Supports MAX_TURNS (default 25) and MODEL_ID env vars.

License

MIT
