Drakeling
A local, lightweight, learning-only companion creature. Drakeling may optionally be linked to the OpenClaw ecosystem.
Drakeling is a small digital dragon that lives on your machine. It reflects, learns about you, and expresses feelings — but never performs tasks, accesses files, or reaches the network. Safe by architecture.
Prerequisites
- Python 3.12+
- One of: `pip`, `pipx`, or `uv`
Installation
Using pipx (recommended — isolated environment)
pipx install drakeling
Using pip
pip install drakeling
Using uv
uv tool install drakeling
After installation, two commands are available:
| Command | Purpose |
|---|---|
| `drakelingd` | Start the background daemon (HTTP API on 127.0.0.1:52780) |
| `drakeling` | Launch the interactive terminal UI |
Getting started
Order matters: Start the daemon first, then the UI in a separate terminal.
1. Start the daemon
drakelingd
On first run, the daemon:
- creates the platform data directory (see Data directory below)
- walks you through an interactive LLM setup — pick your provider, enter your endpoint URL and credentials, and the daemon writes a `.env` file for you
- generates an ed25519 identity keypair (machine binding)
- generates a local API token
- begins listening on `http://127.0.0.1:52780`
Leave the daemon running in its own terminal (or set it up as a background service — see Running as a service).
2. Launch the terminal UI
In a separate terminal:
drakeling
If no creature exists, the UI walks you through the birth ceremony: pick a colour, optionally re-roll up to 3 times, name your dragon, and confirm. Your drakeling starts as an egg and progresses through lifecycle stages as you interact with it.
3. Interact
| Key | Action | What it does |
|---|---|---|
| F2 | Care | Show gentle attention — lifts mood, eases loneliness |
| F3 | Rest | Put your creature to sleep — recovers energy and stability |
| F4 / Ctrl+T | Talk | Focus the text input, type a message and press Enter |
| F5 / Ctrl+F | Feed | Feed your creature — boosts energy and mood |
| F1 / ? | Help | Open the in-app help overlay |
| F8 | Release | Say goodbye (irreversible) |
Talking requires an LLM provider — see LLM configuration. Talking lifts mood, builds trust, sparks curiosity, and eases loneliness.
Embedded terminals (Zed, VS Code, etc.) may intercept F-keys. Use the alternative bindings shown above (?, Ctrl+T, Ctrl+F) when F-keys do not work.
Data directory
All persistent state lives in a platform-specific data directory:
| Platform | Path |
|---|---|
| Linux | ~/.local/share/drakeling/ |
| macOS | ~/Library/Application Support/drakeling/ |
| Windows | %APPDATA%\drakeling\drakeling\ |
Contents:
| File | Purpose |
|---|---|
| `drakeling.db` | SQLite database (creature state, memory, interaction log, lifecycle events) |
| `identity.key` | Ed25519 private key — ties the creature to this machine |
| `api_token` | Bearer token for authenticating API requests |
| `.env` | Optional — environment variable overrides (see below) |
Retrieving the API token
The daemon generates an API token on first run and writes it to the
api_token file in the data directory. To read it later:
| Platform | Command |
|---|---|
| Linux | cat ~/.local/share/drakeling/api_token |
| macOS | cat ~/Library/Application\ Support/drakeling/api_token |
| Windows | type "%APPDATA%\drakeling\drakeling\api_token" |
You need this token for API requests (export, import) and for OpenClaw Skill configuration. See OpenClaw Skill setup.
Upgrading, uninstalling, and reinstalling
Upgrading (keep your creature)
To update the app and keep your creature data:
| Installer | Command |
|---|---|
| pipx | pipx upgrade drakeling |
| pip | pip install --upgrade drakeling |
| uv | uv tool upgrade drakeling |
Restart the daemon after upgrading.
Uninstalling
- Stop the daemon (Ctrl+C or stop the service).
- Uninstall the app:
| Installer | Command |
|---|---|
| pipx | pipx uninstall drakeling |
| pip | pip uninstall drakeling |
| uv | uv tool uninstall drakeling |
Removing creature data
To delete your creature and all local data (database, identity key, exports), remove the data directory:
| Platform | Command |
|---|---|
| Linux | rm -rf ~/.local/share/drakeling |
| macOS | rm -rf ~/Library/Application\ Support/drakeling |
| Windows | rmdir /s /q "%APPDATA%\drakeling\drakeling" |
Clean reinstall (start from scratch)
Uninstall the app, remove the data directory (commands above), then install again.
Linux / macOS (pipx):
pipx uninstall drakeling
rm -rf ~/.local/share/drakeling # Linux
# or: rm -rf ~/Library/Application\ Support/drakeling # macOS
pipx install drakeling
Windows (pipx, Command Prompt or PowerShell):
pipx uninstall drakeling
rmdir /s /q "%APPDATA%\drakeling\drakeling"
pipx install drakeling
Configuration
The daemon reads configuration from environment variables. For persistent
config, place a .env file in the data directory shown above. This is the
preferred approach because background services (systemd, launchd) do not
inherit shell profiles like ~/.bashrc.
Environment variable reference
| Variable | Description | Default |
|---|---|---|
| `DRAKELING_LLM_BASE_URL` | OpenAI-compatible `/v1` endpoint URL | (required unless gateway mode) |
| `DRAKELING_LLM_API_KEY` | API key for the LLM provider | (required unless gateway mode) |
| `DRAKELING_LLM_MODEL` | Model name (e.g. `gpt-4o-mini`, `llama3.3`) | (required unless gateway mode) |
| `DRAKELING_USE_OPENCLAW_GATEWAY` | Delegate LLM calls to OpenClaw gateway | `false` |
| `DRAKELING_OPENCLAW_GATEWAY_URL` | Gateway URL | `http://127.0.0.1:18789` |
| `DRAKELING_OPENCLAW_GATEWAY_TOKEN` | Bearer token for the gateway | (unset) |
| `DRAKELING_OPENCLAW_GATEWAY_MODEL` | Model to request from the gateway (omit to use gateway default) | (unset) |
| `DRAKELING_MAX_TOKENS_PER_CALL` | Per-call token cap | `300` |
| `DRAKELING_MAX_TOKENS_PER_DAY` | Daily token budget | `10000` |
| `DRAKELING_TICK_SECONDS` | Background loop interval (seconds, minimum 10) | `60` |
| `DRAKELING_MIN_REFLECTION_INTERVAL` | Minimum seconds between background reflections | `600` |
| `DRAKELING_PORT` | Daemon HTTP port | `52780` |
LLM configuration
Your creature needs an LLM provider to talk and reflect. On first run,
drakelingd walks you through setup interactively. You can also configure it
manually by editing the .env file in the data directory.
Important base URL rule:
- `DRAKELING_LLM_BASE_URL` must point to the provider's API root (usually ending in `/v1`).
- Do not include `/chat/completions` in `DRAKELING_LLM_BASE_URL`.
- Drakeling appends `/chat/completions` automatically.

Examples:
- Correct: `http://127.0.0.1:11434/v1`
- Wrong: `http://127.0.0.1:11434/v1/chat/completions`
Common base URL patterns (direct provider mode):
| Provider | Base URL (DRAKELING_LLM_BASE_URL) |
|---|---|
| OpenAI | https://api.openai.com/v1 |
| Ollama (local) | http://127.0.0.1:11434/v1 |
| LM Studio (local server) | http://127.0.0.1:1234/v1 |
| vLLM (default local server) | http://127.0.0.1:8000/v1 |
| OpenRouter | https://openrouter.ai/api/v1 |
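To make the join rule concrete, here is a minimal sketch of how an OpenAI-compatible client derives the request URL from the base URL. The helper is hypothetical (not Drakeling's actual code), but it illustrates why a base URL that already ends in `/chat/completions` produces the doubled path shown in the troubleshooting section below:

```python
def chat_completions_url(base_url: str) -> str:
    """Join a provider base URL with the chat-completions path.

    Mirrors how OpenAI-compatible clients build request URLs:
    the client appends /chat/completions to the configured root.
    """
    base = base_url.rstrip("/")
    if base.endswith("/chat/completions"):
        raise ValueError(
            "Base URL must be the provider root (e.g. .../v1), "
            "not the full /chat/completions endpoint"
        )
    return base + "/chat/completions"


print(chat_completions_url("http://127.0.0.1:11434/v1"))
# http://127.0.0.1:11434/v1/chat/completions
```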
Option A — Any OpenAI-compatible LLM provider
Works with OpenAI, Ollama, vLLM, LiteLLM, or any service that exposes an
OpenAI-compatible /v1 endpoint.
DRAKELING_LLM_BASE_URL=https://api.openai.com/v1
DRAKELING_LLM_API_KEY=sk-...
DRAKELING_LLM_MODEL=gpt-4o-mini
For local LLMs (e.g. Ollama), the API key can be any non-empty string:
DRAKELING_LLM_BASE_URL=http://127.0.0.1:11434/v1
DRAKELING_LLM_API_KEY=ollama-local
DRAKELING_LLM_MODEL=llama3.3
Common model name examples (set in DRAKELING_LLM_MODEL):
- Ollama local: `qwen3:14b`, `llama3.3`
- OpenAI: `gpt-4o-mini`
- OpenRouter: `openai/gpt-oss-20b`
- vLLM (self-hosted): `NousResearch/Meta-Llama-3-8B-Instruct`
Option B — OpenClaw gateway delegation
If you already run OpenClaw, this is the easiest option. Any model OpenClaw supports (cloud or local) becomes available to Drakeling with no additional provider configuration.
DRAKELING_USE_OPENCLAW_GATEWAY=true
# DRAKELING_OPENCLAW_GATEWAY_URL= # leave blank for default http://127.0.0.1:18789
# DRAKELING_OPENCLAW_GATEWAY_TOKEN= # leave blank if gateway has no auth
# DRAKELING_OPENCLAW_GATEWAY_MODEL=openai/gpt-oss-20b
If you set DRAKELING_OPENCLAW_GATEWAY_MODEL, use a model identifier that
your OpenClaw gateway can serve (for example cloud models like
openai/gpt-oss-20b or local models exposed by your OpenClaw setup).
Troubleshooting common URL mistakes
If daemon logs show an error like:
404 Not Found ... /v1/chat/completions/chat/completions
your base URL is too specific. This usually means
DRAKELING_LLM_BASE_URL was set to include /chat/completions.
Fix:
- Set `DRAKELING_LLM_BASE_URL` to the provider root only (for example `http://127.0.0.1:11434/v1`).
- Keep `/chat/completions` out of the `.env` value.
- Restart `drakelingd` after updating `.env`.
Export and import
Export (backup)
Your creature can be exported as an encrypted .drakeling bundle file
containing the database and identity key.
curl -X POST http://127.0.0.1:52780/export \
-H "Authorization: Bearer $(cat ~/.local/share/drakeling/api_token)" \
-H "Content-Type: application/json" \
-d '{"passphrase": "your-secret-passphrase", "output_path": "/tmp/my-dragon.drakeling"}'
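The same export call can be made from Python's standard library. The sketch below (a hypothetical helper, assuming the default port and the request body documented above) separates building the request from sending it, so scripts can log or inspect the request before firing it:

```python
import json
import urllib.request


def export_request(token: str, passphrase: str, output_path: str) -> urllib.request.Request:
    """Build the POST /export request documented above (default port 52780)."""
    body = json.dumps({"passphrase": passphrase, "output_path": output_path}).encode()
    return urllib.request.Request(
        "http://127.0.0.1:52780/export",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Sending it requires a running daemon:
# with urllib.request.urlopen(export_request(token, "your-secret-passphrase",
#                                            "/tmp/my-dragon.drakeling")) as resp:
#     print(resp.status)
```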
Import (restore / migrate)
To import a bundle onto a new machine, start the daemon in import-ready mode:
drakelingd --allow-import
Then send the import request:
curl -X POST http://127.0.0.1:52780/import \
-H "Authorization: Bearer $(cat ~/.local/share/drakeling/api_token)" \
-H "Content-Type: application/json" \
-d '{"passphrase": "your-secret-passphrase", "bundle_path": "/tmp/my-dragon.drakeling"}'
The daemon creates a .bak backup before importing and rolls back automatically
if anything goes wrong. After a successful import, restart the daemon normally
(without --allow-import).
CLI reference
drakelingd
| Flag | Description |
|---|---|
| (no flags) | Normal production mode |
| `--dev` | Development mode: verbose stdout logging, no background reflection, import always permitted |
| `--allow-import` | Enable the `POST /import` endpoint (disabled by default for safety) |
drakeling
No flags. Connects to the local daemon and launches the interactive terminal UI.
Running as a service
For production use, the daemon should run as a background service that starts
automatically on login. Template files are provided in deploy/.
Linux — systemd
cp deploy/drakeling.service ~/.config/systemd/user/
systemctl --user daemon-reload
systemctl --user enable --now drakeling
Check status: systemctl --user status drakeling
macOS — launchd
cp deploy/drakeling.plist ~/Library/LaunchAgents/
launchctl load ~/Library/LaunchAgents/drakeling.plist
Windows — Task Scheduler
schtasks /create /tn "Drakeling" /tr "drakelingd" /sc onlogon /rl limited /f
Or import deploy/drakeling-task.xml via the Task Scheduler GUI.
OpenClaw Skill setup
This lets OpenClaw agents check on your drakeling and give it care autonomously.
1. Install the skill: `clawhub install drakeling` (or copy `skill/` to `~/.openclaw/skills/drakeling/`)
2. Start the daemon at least once: `drakelingd`
3. Read the API token:
   - Linux: `cat ~/.local/share/drakeling/api_token`
   - macOS: `cat ~/Library/Application\ Support/drakeling/api_token`
   - Windows: `type "%APPDATA%\drakeling\drakeling\api_token"`
4. Add to `~/.openclaw/openclaw.json` under `skills.entries.drakeling`:
   `{ "skills": { "entries": { "drakeling": { "env": { "DRAKELING_API_TOKEN": "paste-token-here" } } } } }`
See docs/openclaw_integration.md for the full OpenClaw integration guide (config format, gateway delegation, and references).
The skill only uses /status (read) and /care (write). It never calls
/talk, /rest, /export, or /import.
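As a sketch, the read-only check the skill performs might look like the following from a script. This is a hypothetical illustration, assuming only the documented `GET /status` endpoint, bearer-token auth, and the default port:

```python
import json
import urllib.request


def status_request(token: str, base: str = "http://127.0.0.1:52780") -> urllib.request.Request:
    """Build the read-only GET /status request used to check on the creature."""
    return urllib.request.Request(
        f"{base}/status",
        headers={"Authorization": f"Bearer {token}"},
    )


# Sending it requires a running daemon:
# with urllib.request.urlopen(status_request(token)) as resp:
#     status = json.load(resp)
```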
Development
Setup
git clone https://github.com/BVisagie/drakeling.git
cd drakeling
Using pip:
python3 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
Using pipx:
pipx install --editable .
Using uv:
uv venv
source .venv/bin/activate
uv pip install -e ".[dev]"
Running in dev mode
drakelingd --dev
Dev mode:
- Logs all lifecycle events and token usage to stdout
- Disables background reflection (tick loop still runs for stat decay)
- Permits import without `--allow-import`
Running tests
pytest
The test suite covers domain models, trait generation, stat decay/boost, lifecycle transitions, crypto (identity, tokens, encrypted bundles), sprites, and API integration tests.
Project structure
src/drakeling/
domain/ Pure domain logic (models, traits, decay, lifecycle, sprites)
crypto/ Ed25519 identity, API tokens, encrypted bundles
storage/ SQLAlchemy models and database init
llm/ LLM wrapper and prompt construction
daemon/ Daemon entry point, config, background tick loop
api/ FastAPI endpoints (birth, status, care, talk, rest, export/import)
ui/ Textual terminal UI (birth ceremony, main screen, widgets)