
crawlerverse

Python SDK for the Crawler Agent API. Build AI agents that play the Crawler roguelike game.

Installation

pip install crawlerverse

Quick Start

from crawlerverse import CrawlerClient, run_game, Attack, Wait, Direction, Observation, Action

# Map (dx, dy) offsets to Direction values
OFFSET_TO_DIR = {
    (0, -1): Direction.NORTH, (0, 1): Direction.SOUTH,
    (1, 0): Direction.EAST, (-1, 0): Direction.WEST,
    (1, -1): Direction.NORTHEAST, (-1, -1): Direction.NORTHWEST,
    (1, 1): Direction.SOUTHEAST, (-1, 1): Direction.SOUTHWEST,
}

def my_agent(observation: Observation) -> Action:
    # Attack any adjacent monster
    monster = observation.nearest_monster()
    if monster:
        tile, _ = monster
        dx = tile.x - observation.player.position[0]
        dy = tile.y - observation.player.position[1]
        direction = OFFSET_TO_DIR.get((dx, dy))
        if direction is not None:
            return Attack(direction=direction)

    # Otherwise just wait
    return Wait()

with CrawlerClient(api_key="cra_...") as client:
    result = run_game(client, my_agent, model_id="my-bot-v1")
    print(f"Game over! Floor {result.outcome.floor}, result: {result.outcome.status}")
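The agent above attacks only when a monster is already adjacent; a natural next step is to walk toward the nearest monster first. One way to compute that step is to clamp each axis of the offset to a unit value so the result is always a key of the eight-entry OFFSET_TO_DIR table. A minimal sketch (pure Python, assuming diagonal movement is allowed, as the diagonal Direction values suggest):

```python
def step_toward(player_pos, target_pos):
    """Unit (dx, dy) offset that moves one tile toward the target.

    Each axis is clamped to -1, 0, or +1, so the result is always a key
    of the eight-entry OFFSET_TO_DIR table, or (0, 0) when the target is
    underfoot.
    """
    sign = lambda v: (v > 0) - (v < 0)
    return (
        sign(target_pos[0] - player_pos[0]),
        sign(target_pos[1] - player_pos[1]),
    )
```

Feeding the result through OFFSET_TO_DIR yields a direction for Move; a (0, 0) result means there is nothing to walk toward.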

Authentication

Set your API key via parameter or environment variable:

# Option 1: Pass directly
client = CrawlerClient(api_key="cra_...")

# Option 2: Environment variable
# export CRAWLERVERSE_API_KEY=cra_...
client = CrawlerClient()

Async Support

from crawlerverse import AsyncCrawlerClient, async_run_game

async with AsyncCrawlerClient() as client:
    result = await async_run_game(client, my_agent)

API Reference

Client Methods

client.games.create(model_id="gpt-4o")    # Start a new game
client.games.list(status="completed")     # List your games
client.games.get(game_id)                 # Get game state
client.games.action(game_id, Move(...))   # Submit action
client.games.abandon(game_id)             # Abandon game
client.health()                           # Health check
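These methods can drive a game directly when run_game's loop is too coarse. The sketch below is not verified against the SDK: it assumes the created game exposes an id, that games.get returns a state with observation and status fields, and that status leaves "active" when the game ends; check the real response shapes before relying on it.

```python
def play_manually(client, agent, model_id="manual-bot"):
    """Drive one game through the low-level games API instead of run_game.

    Assumes (not confirmed by the SDK docs) that the created game has an
    `id`, that `games.get` returns a state with `observation` and
    `status`, and that `status` is "active" until the game ends.
    """
    game = client.games.create(model_id=model_id)
    state = client.games.get(game.id)
    while state.status == "active":
        # Let the agent pick an action for the current observation.
        client.games.action(game.id, agent(state.observation))
        state = client.games.get(game.id)
    return state
```

A manual loop like this is useful for inserting per-turn logging, timeouts, or checkpointing between actions.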

Actions

Move(direction=Direction.NORTH)
Attack(direction=Direction.EAST)
Wait()
Pickup()
Drop(item_type="health-potion")
Use(item_type="health-potion")
Equip(item_type="iron-sword")
EnterPortal()
RangedAttack(direction=Direction.SOUTH, distance=5)
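RangedAttack takes a distance, so agents often need to decide between melee and ranged based on how far away the target is. This sketch assumes range is measured as Chebyshev distance (tiles along the eight directions), which matches the movement model but is not confirmed by the SDK docs:

```python
def chebyshev(a, b):
    # Tile distance when diagonal steps cost the same as straight ones.
    return max(abs(a[0] - b[0]), abs(a[1] - b[1]))

def prefers_ranged(player_pos, monster_pos, max_range=5):
    # Ranged only makes sense beyond melee reach and within weapon range.
    d = chebyshev(player_pos, monster_pos)
    return 1 < d <= max_range
```

An adjacent monster (distance 1) falls through to the melee Attack path; anything beyond max_range calls for closing the gap first.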

Observation Helpers

obs.tile_at(x, y)              # Look up tile by coordinates
obs.monsters()                 # All visible monsters
obs.nearest_monster()          # Closest monster
obs.items_at_feet()            # Items at player's position
obs.has_item("sword")          # Check inventory
obs.can_move(Direction.NORTH)  # Check if direction is walkable
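can_move composes naturally into a fallback policy: try directions in preference order and take the first walkable one. A small sketch relying only on that helper (the preference list itself is up to the agent):

```python
def first_walkable(obs, preferred):
    """Return the first direction in `preferred` that obs.can_move accepts.

    Returns None when every candidate is blocked, so the caller can fall
    back to Wait().
    """
    for direction in preferred:
        if obs.can_move(direction):
            return direction
    return None
```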

Logging

The SDK uses Python's standard logging module under the crawlerverse logger. To see game progress and debug info:

import logging
logging.basicConfig(level=logging.INFO)

For more detail (e.g., retry attempts, action payloads):

logging.getLogger("crawlerverse").setLevel(logging.DEBUG)
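Raising the crawlerverse logger to DEBUG only produces output if some handler emits its records; basicConfig attaches one to the root logger, but that raises verbosity for every library at once. To scope debug output to the SDK alone, a dedicated handler can be attached (standard library only):

```python
import logging

sdk_logger = logging.getLogger("crawlerverse")
handler = logging.StreamHandler()
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)
sdk_logger.addHandler(handler)
sdk_logger.setLevel(logging.DEBUG)
sdk_logger.propagate = False  # keep SDK records out of the root handlers
```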

Examples

All examples default to a local API at http://localhost:3000/api/agent. Set CRAWLERVERSE_BASE_URL to point at production.

OpenAI

See examples/openai_agent.py:

pip install openai
export CRAWLERVERSE_API_KEY=cra_...
export OPENAI_API_KEY=sk-...
python examples/openai_agent.py

Works with any OpenAI-compatible provider (Ollama, LMStudio, Azure, etc.) via OPENAI_BASE_URL.

Anthropic (Claude)

See examples/anthropic_agent.py:

pip install anthropic
export CRAWLERVERSE_API_KEY=cra_...
export ANTHROPIC_API_KEY=sk-ant-...
python examples/anthropic_agent.py

Uses Claude Haiku 4.5 by default. Override with ANTHROPIC_MODEL=claude-sonnet-4-5.

Local LLM (Ollama / LMStudio)

See examples/local_llm_agent.py for a script with configurable turn limits and error recovery:

pip install openai
export CRAWLERVERSE_API_KEY=cra_...
export OPENAI_BASE_URL=http://localhost:11434/v1
export OPENAI_MODEL=llama3
python examples/local_llm_agent.py

Supports MAX_TURNS (default 25) and MODEL_ID env vars.

License

MIT
