A decorator that uses an LLM to write your function bodies. A joke; do not use.
Project description
sloplib
This project is 100% LLM slop and a joke. Do not use it for anything that matters. It
`exec()`s code generated by an LLM at runtime. There is no sandbox. There is no safety net. The LLM might write `os.system("rm -rf /")` and we will run it. You have been warned.
A Python port of shorwood/slopc, a Rust proc macro that uses an LLM to write your
function bodies. All credit for the original idea (and most of the design) goes to @shorwood.
This is just the Python flavour of the same bad decision.
What
```python
from sloplib import slop

@slop
def levenshtein(a: str, b: str) -> int:
    """Compute the Levenshtein edit distance between two strings."""
    ...

print(levenshtein("kitten", "sitting"))  # 3, hopefully
```
The @slop decorator captures the function name, parameter names + types,
return type, and docstring; ships them to an OpenAI-compatible chat endpoint;
parses the response back into Python source; verifies it compiles cleanly;
caches it on disk; and binds the resulting callable as your function.
On verify failure, it feeds the error back to the model and retries.
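The capture step can be sketched with nothing but the standard library. `describe` below is a hypothetical helper, not sloplib's actual prompt builder; it just shows what `inspect` recovers from a decorated stub:

```python
import inspect

def describe(fn):
    """Collect the pieces @slop ships to the model: name, typed
    parameters, return annotation, and docstring (a sketch; the
    real prompt format is internal to sloplib)."""
    sig = inspect.signature(fn)
    params = ", ".join(
        f"{name}: {p.annotation.__name__}"
        if p.annotation is not inspect.Parameter.empty else name
        for name, p in sig.parameters.items()
    )
    ret = (sig.return_annotation.__name__
           if sig.return_annotation is not inspect.Signature.empty else "None")
    return f"def {fn.__name__}({params}) -> {ret}:\n    \"\"\"{inspect.getdoc(fn)}\"\"\""

def levenshtein(a: str, b: str) -> int:
    """Compute the Levenshtein edit distance between two strings."""
    ...

print(describe(levenshtein))
```

Everything the model sees comes from this signature + docstring bundle, which is why the docstring is effectively the spec.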
Install
```shell
uv add sloplib
# or
pip install sloplib
```
Set an API key for whatever provider you point it at:
```shell
export SLOPLIB_API_KEY=sk-...
```
Configure
```python
@slop(
    retries=5,
    model="openai/gpt-4o-mini",
    provider="https://openrouter.ai/api/v1/chat/completions",
    api_key_env="SLOPLIB_API_KEY",
    cache=True,
    ultra_slop=False,
    dump="generated/levenshtein.py",
    context_file="src/types.py",
    hint="dynamic programming",
)
def levenshtein(a: str, b: str) -> int:
    """Compute the Levenshtein edit distance between two strings."""
    ...
```
| Param | Default | Notes |
|---|---|---|
| `model` | `"gpt-4o-mini"` | LLM model id |
| `provider` | OpenRouter chat completions | any OpenAI-compatible endpoint |
| `api_key_env` | `"SLOPLIB_API_KEY"` | env var to read the API key from |
| `retries` | `3` | retries on verification failure |
| `cache` | `True` | persist generated source on disk and reuse it on later decorations |
| `ultra_slop` | `False` | regenerate the body on every call. Maximum slop. |
| `dump` | `None` | also write the generated source to this path |
| `context_file` | `None` | extra file content to feed the prompt |
| `hint` | `None` | freeform nudge string |
| `timeout` | `60.0` | HTTP timeout (seconds) |
| `cache_dir` | `".sloplib_cache"` | per-project cache dir |
Configuration precedence: decorator args > env vars > `pyproject.toml` `[tool.sloplib]`
(or `slop.toml`) > defaults.
Environment variables: SLOPLIB_MODEL, SLOPLIB_PROVIDER, SLOPLIB_API_KEY_ENV,
SLOPLIB_RETRIES, SLOPLIB_CACHE, SLOPLIB_ULTRA_SLOP, SLOPLIB_HINT, SLOPLIB_TIMEOUT,
SLOPLIB_CACHE_DIR, SLOPLIB_DUMP, SLOPLIB_CONTEXT_FILE.
```toml
# pyproject.toml
[tool.sloplib]
model = "openai/gpt-4o-mini"
retries = 5
provider = "https://openrouter.ai/api/v1/chat/completions"
api_key_env = "SLOPLIB_API_KEY"
```
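The precedence chain can be illustrated with a toy resolver. The `resolve` function and `DEFAULTS` table below are illustrative assumptions, not sloplib internals:

```python
import os

# Illustrative defaults mirroring the table above (not sloplib's actual code).
DEFAULTS = {"model": "gpt-4o-mini", "retries": 3}

def resolve(key, decorator_args, pyproject_cfg):
    """Walk the documented chain: decorator arg > SLOPLIB_* env var >
    [tool.sloplib] > built-in default."""
    if key in decorator_args:
        return decorator_args[key]
    env = os.environ.get(f"SLOPLIB_{key.upper()}")
    if env is not None:
        return env
    if key in pyproject_cfg:
        return pyproject_cfg[key]
    return DEFAULTS[key]

# pyproject sets retries=5, but an explicit decorator arg wins:
print(resolve("retries", {"retries": 7}, {"retries": 5}))  # 7
# Nothing set anywhere: falls through to the default
# (assuming SLOPLIB_MODEL is unset in the environment).
print(resolve("model", {}, {}))
```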
ultra_slop
Pass `ultra_slop=True` and the wrapper hits the LLM on every single call,
re-prompting, re-verifying, re-`exec()`ing, and binding a fresh callable.
Each invocation is an independent hallucination. There is no reason to do this.
Do it anyway.
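The cached-vs-regenerate distinction can be modeled with a toy decorator, where `generate()` stands in for the whole prompt/verify/exec round trip. All names here are hypothetical, not sloplib's API:

```python
import functools

def slop_sketch(ultra_slop=False):
    """Toy model: cache one generated implementation, or make a
    fresh one per call when ultra_slop is on."""
    def deco(fn):
        calls = {"generations": 0}

        def generate():
            # Stand-in for prompt -> verify -> exec; here it just
            # wraps the original body so the sketch stays runnable.
            calls["generations"] += 1
            return lambda *a, **kw: fn(*a, **kw)

        cached = None if ultra_slop else generate()

        @functools.wraps(fn)
        def wrapper(*a, **kw):
            impl = generate() if ultra_slop else cached
            return impl(*a, **kw)

        wrapper.calls = calls
        return wrapper
    return deco

@slop_sketch(ultra_slop=True)
def add(a: int, b: int) -> int:
    """Add."""
    return a + b

add(1, 2)
add(3, 4)
print(add.calls["generations"])  # one fresh "generation" per call -> 2
```

With `ultra_slop=False` the generation count would stay at 1 no matter how many times you call the function, which is the whole point of the cache.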
How it works
- Reads the function signature, type annotations, and docstring with `inspect`.
- Builds a chat-completions request and POSTs it to the configured endpoint.
- Strips `` ```python `` fences from the response and locates `def <name>(`.
- `compile()` + `exec()` the source into a fresh namespace; pulls out the named callable.
- Caches the verified source under `.sloplib_cache/<name>-<hash>.json`, keyed by `(name, signature, docstring, model, provider, hint, context_file_sha)`. Any change invalidates the cache and triggers regeneration.
- On verify failure, the prior source + error is appended to the next prompt and the loop retries up to `retries` times before raising `SlopError`.
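The cache-keying step can be sketched as a content hash over those fields. This is a sketch under the assumption of SHA-256 over a JSON payload; sloplib's actual hashing scheme may differ:

```python
import hashlib
import json

def cache_key(name, signature, docstring, model, provider, hint, context_sha):
    """Hash every input that should invalidate the cache when it
    changes (illustrative, not sloplib's real scheme)."""
    payload = json.dumps(
        [name, signature, docstring, model, provider, hint, context_sha]
    )
    return hashlib.sha256(payload.encode()).hexdigest()[:16]

k1 = cache_key("levenshtein", "(a: str, b: str) -> int", "Compute...",
               "gpt-4o-mini", "openrouter", None, None)
k2 = cache_key("levenshtein", "(a: str, b: str) -> int", "Compute...",
               "gpt-4o-mini", "openrouter", "dynamic programming", None)
print(k1 != k2)  # changing only the hint yields a different key -> True
```

Because the docstring is part of the key, even rewording the spec forces a regeneration, which is the behavior the README describes.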
Why
There is no good reason. See the original
slopc README for the appropriate vibe.
Don't
- Don't commit `.sloplib_cache/` if your prompts contain secrets; add it to `.gitignore`.
Human Note
This section is the only human-made part of the project. I just wanted to say that I really liked the idea of slopc and thought it was very funny, and I wanted a Python implementation as well, for shitposting at work.
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file sloplib-0.1.0.tar.gz.
File metadata
- Download URL: sloplib-0.1.0.tar.gz
- Upload date:
- Size: 9.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.11.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `44b351ec3e2bd61c93a290cfd99359408295dc1bcc36ec6e6135f0a2c86f1515` |
| MD5 | `010ca470243f3be7ad2a1164906ae971` |
| BLAKE2b-256 | `dd97e4dca028b04b481575cd613f06bf8a16fdf8ad4bcadd73344cb4e44328dd` |
File details
Details for the file sloplib-0.1.0-py3-none-any.whl.
File metadata
- Download URL: sloplib-0.1.0-py3-none-any.whl
- Upload date:
- Size: 12.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: uv/0.11.8
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `d75481cddbd502193220ed9d8303f639b39964ce35c7b78d39baebe7834f2b93` |
| MD5 | `c078e90aa2830d013ca6595e5c4d2a20` |
| BLAKE2b-256 | `a6adac27f3186b1f217cde473625d62c397eab0a31b001705abb096997081c60` |