Sleuth
Use an RLM to figure out your log files.
Git bisect for production incidents. Turn 4 hours of Splunk scrollback into a 90-second RCA draft.
Runs locally. BYO model. Apache-2.0.
Live demo → · Docs · Quickstart
Try it
pip install sleuth-rlm
export ANTHROPIC_API_KEY=...
sleuth ask "why did checkout fail around 3am?" --logs ./logs/
What you get
Every run writes one case.sleuth.json: the question, every step the agent took, the root cause it landed on, the log lines it cited, and a confidence score. Self-contained, replayable, small enough to paste into a Slack thread.
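To make the shape of that artifact concrete, here is a minimal sketch of what a case.sleuth.json might contain. The exact key names below are assumptions for illustration, not the published schema; only the fields listed above (question, steps, root cause, citations, confidence) come from the docs.

```python
import json

# Hypothetical case.sleuth.json contents -- key names are illustrative.
case = {
    "question": "why did checkout fail around 3am?",
    "steps": [
        {"tool": "top_errors", "args": {"limit": 20}},
        {"tool": "around", "args": {"ts": "02:58:04", "window_s": 60}},
    ],
    "root_cause": "checkout-worker cached a rotated secret with no reload path",
    "citations": ["02:58:04 vault rotated stripe_api_key to v7"],
    "confidence": 0.85,
}
print(json.dumps(case, indent=2))
```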
checkout-worker is still presenting stripe_api_key version 6 to payment-gateway after vault rotated the secret to version 7 at 02:58:04 UTC. payment-gateway subscribed to the rotation webhook and flipped to v7 instantly… checkout-worker did not subscribe and caches the secret in-process with no SIGHUP handler.
That's a real root cause from examples/checkout-incident/. Open it in the viewer →
How it works
A DSPy RLM-style agent with 5 read-only query tools over a DuckDB event store, plus a side LLM for judgement calls. Writes Python. Runs it sandboxed. Iterates. Terminates by submitting an IncidentReport or blowing the budget.
Full tool surface
- schema() — services, levels, time window, row count
- top_errors(limit=20) — loudest failure modes
- search(pattern, limit=10) — substring match across msg and raw
- around(ts, window_s=60, service=None) — what happened next to this timestamp
- trace(trace_id) — follow one request across services
- llm_query(question, context="") — ask a secondary LLM a judgement question
- submit_incident_report(report) — terminal, validated against schema
The loop: LLM emits Python → we exec it → feed stdout back → repeat until submit_incident_report or budget exhaustion.
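The loop above can be sketched in a few lines. `llm_step` and the `tools` namespace here are hypothetical stand-ins, not Sleuth's real API, and the real agent sandboxes the exec; this sketch does not.

```python
import io
import contextlib

# Minimal sketch of the emit -> exec -> feed-back loop.
def run_loop(llm_step, tools, max_steps=10):
    transcript = []
    report = {}

    def submit_incident_report(r):
        report.update(r)  # terminal tool: capture the final report

    env = dict(tools, submit_incident_report=submit_incident_report)
    for _ in range(max_steps):
        code = llm_step(transcript)                 # LLM emits Python
        out = io.StringIO()
        with contextlib.redirect_stdout(out):
            exec(code, env)                         # we exec it (unsandboxed here)
        transcript.append((code, out.getvalue()))   # stdout fed back next turn
        if report:                                  # submit_incident_report was called
            return report, transcript
    return report, transcript                       # budget exhausted
```

A fake two-turn agent shows the flow: the first turn queries a tool and its stdout lands in the transcript; the second turn submits the report and the loop terminates.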
Works with
- Log platforms — auto-detects exports from Splunk, Datadog, New Relic, Honeycomb. Click Export, point Sleuth at the file, done. Or feed it raw .jsonl/.log/.gz. Integration guides →
- Models — anything LiteLLM speaks: Claude, GPT, Llama, Bedrock, Ollama. One flag, no lock-in.
- Security — secrets redacted at ingest (Bearer, Stripe, AWS, GitHub, Slack, JWT, SSH) before anything touches the LLM. Logs never leave your box.
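Redaction-at-ingest is conceptually a pattern pass over each line before it reaches the model. The patterns below are illustrative examples of the token families named above, not Sleuth's actual ruleset.

```python
import re

# Example redaction pass -- patterns are illustrative, not Sleuth's real rules.
PATTERNS = [
    (re.compile(r"Bearer\s+[A-Za-z0-9\-._~+/]+=*"), "Bearer [REDACTED]"),
    (re.compile(r"sk_live_[A-Za-z0-9]+"), "[STRIPE_KEY]"),    # Stripe secret key
    (re.compile(r"AKIA[0-9A-Z]{16}"), "[AWS_ACCESS_KEY]"),    # AWS access key ID
    (re.compile(r"ghp_[A-Za-z0-9]{36}"), "[GITHUB_TOKEN]"),   # GitHub PAT
]

def redact(line: str) -> str:
    """Replace known secret shapes before the line touches any LLM."""
    for pattern, replacement in PATTERNS:
        line = pattern.sub(replacement, line)
    return line
```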
Docs
exorust.github.io/sleuth · Live viewer · Reference incident · PyPI
If Sleuth saves you a postmortem, give it a star. That's how I know it's worth pushing.
Apache-2.0.
File details
Details for the file sleuth_rlm-0.0.1.tar.gz.
File metadata
- Download URL: sleuth_rlm-0.0.1.tar.gz
- Upload date:
- Size: 516.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ba1240b83f0967ce418aec8a918d9854eb9248196643f5920a1f7b4c6a5e71a1 |
| MD5 | 8d63e76a01c9a57871f829ea338e7b9d |
| BLAKE2b-256 | b05b5534e3fcf4b1484be8d9d065880bd3a4c9546b9b4d27fe5f70982ea9e1ac |
Provenance
The following attestation bundles were made for sleuth_rlm-0.0.1.tar.gz:
Publisher: release.yml on Exorust/sleuth
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: sleuth_rlm-0.0.1.tar.gz
- Subject digest: ba1240b83f0967ce418aec8a918d9854eb9248196643f5920a1f7b4c6a5e71a1
- Sigstore transparency entry: 1340659720
- Sigstore integration time:
- Permalink: Exorust/sleuth@e87a0a0fe1273d6842378e7315e618b17fdd8ef0
- Branch / Tag: refs/tags/v0.0.1
- Owner: https://github.com/Exorust
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@e87a0a0fe1273d6842378e7315e618b17fdd8ef0
- Trigger Event: push
File details
Details for the file sleuth_rlm-0.0.1-py3-none-any.whl.
File metadata
- Download URL: sleuth_rlm-0.0.1-py3-none-any.whl
- Upload date:
- Size: 24.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ab00211ef20235c3418b1a2e0beb51e88d4c2128a1a1f1f1169706e5402abcbb |
| MD5 | f1f7b60db8e46f8d7fc4b1ff6d2c7584 |
| BLAKE2b-256 | fcbdef52d51fee9a951d22a8c764e3d91b6f90bc55945c33c20e961530d1a0bd |
Provenance
The following attestation bundles were made for sleuth_rlm-0.0.1-py3-none-any.whl:
Publisher: release.yml on Exorust/sleuth
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: sleuth_rlm-0.0.1-py3-none-any.whl
- Subject digest: ab00211ef20235c3418b1a2e0beb51e88d4c2128a1a1f1f1169706e5402abcbb
- Sigstore transparency entry: 1340659721
- Sigstore integration time:
- Permalink: Exorust/sleuth@e87a0a0fe1273d6842378e7315e618b17fdd8ef0
- Branch / Tag: refs/tags/v0.0.1
- Owner: https://github.com/Exorust
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: release.yml@e87a0a0fe1273d6842378e7315e618b17fdd8ef0
- Trigger Event: push