Wishful thinking for Python
wishful 🪄
"Code so good, you'd think it was wishful thinking"
Stop writing boilerplate. Start wishing for it instead.
wishful turns your wildest import dreams into reality. Just write the import you wish existed, and an LLM conjures up the code on the spot. The first run? Pure magic. Every run after? Blazing fast, because it's cached like real Python.
Think of it as wishful thinking, but for imports. The kind that actually works.
✨ Quick Wish
1. Install the dream
```bash
pip install wishful
```
2. Set your credentials (litellm reads the usual suspects)
Export them or toss them in a .env file—we'll find them:
```bash
export OPENAI_API_KEY=...
export DEFAULT_MODEL=azure/gpt-4.1
```

or

```bash
export AZURE_API_KEY=...
export AZURE_API_BASE=https://<your-endpoint>.openai.azure.com/
export AZURE_API_VERSION=2025-04-01-preview
export DEFAULT_MODEL=azure/gpt-4.1
```

or any other provider supported by litellm.
3. Import your wildest fantasies
```python
from wishful.text import extract_emails
from wishful.dates import to_yyyy_mm_dd

raw = "Contact us at team@example.com or sales@demo.dev"
print(extract_emails(raw))          # ['team@example.com', 'sales@demo.dev']
print(to_yyyy_mm_dd("31.12.2025"))  # '2025-12-31'
```
What just happened?
- First import: wishful waves its wand 🪄, asks the LLM to write `extract_emails` and `to_yyyy_mm_dd`, validates the code for safety, and caches it to `.wishful/text.py` and `.wishful/dates.py`.
- Every subsequent run: instant. Just regular Python imports. No latency, no drama, no API calls.
It's like having a junior dev who never sleeps and always delivers exactly what you asked for (well, almost always).
🎯 Wishful Guidance: Help the AI Read Your Mind
Want better results? Drop hints. Literal comments. wishful reads the code around your import and forwards that context to the LLM.
```python
from pathlib import Path

# desired: parse standard nginx combined logs into list of dicts
from wishful.logs import parse_nginx_logs

records = parse_nginx_logs(Path("/var/log/nginx/access.log").read_text())
```
The AI sees your comment and knows exactly what you're after. It's like pair programming, but your partner is a disembodied intelligence with questionable opinions about semicolons.
🗄️ Cache Ops: Because Sometimes Wishes Need Revising
```python
import wishful

# See what you've wished for
wishful.inspect_cache()             # ['.wishful/text.py', '.wishful/dates.py']

# Regret a wish? Regenerate it
wishful.regenerate("wishful.text")  # Next import re-generates from scratch

# Nuclear option: forget everything
wishful.clear_cache()               # Deletes the entire .wishful/ directory
```
The cache is just regular Python files in .wishful/. Want to tweak the generated code? Edit it directly. It's your wish, after all.
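Since the cache is plain Python, it helps to know what you're editing. The module below is an illustrative guess at what a generated `.wishful/text.py` might contain; the actual LLM output will differ from run to run (the regex and docstring here are my own, not wishful's):

```python
# .wishful/text.py -- illustrative sketch of a generated module
import re

# Simple pattern for common email shapes; a real generation may differ.
EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(text: str) -> list[str]:
    """Return all email addresses found in `text`, in order of appearance."""
    return EMAIL_RE.findall(text)

print(extract_emails("Contact us at team@example.com or sales@demo.dev"))
```

If the generated version mishandles an edge case you care about, fix it in place and the fixed version is what every future import loads.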
⚙️ Configuration: Fine-Tune Your Wishes
```python
import wishful

wishful.configure(
    model="gpt-4o-mini",        # Switch models like changing channels
    cache_dir="/tmp/.wishful",  # Hide your wishes somewhere else
    spinner=False,              # Silence the "generating..." spinner
    review=True,                # Paranoid? Review code before it runs
    allow_unsafe=False,         # Keep the safety rails ON (recommended)
)
```
Environment Variables (for the env-obsessed)
Set these in your shell or .env file:
- `WISHFUL_MODEL` / `DEFAULT_MODEL` — which AI overlord to summon
- `WISHFUL_CACHE_DIR` — where to stash generated wishes (default: `.wishful`)
- `WISHFUL_REVIEW` — set to `1` to manually approve every wish (trust issues?)
- `WISHFUL_DEBUG` — verbose logging for when things go sideways
- `WISHFUL_UNSAFE` — set to `1` to disable safety checks (⚠️ danger zone)
- `WISHFUL_SPINNER` — set to `0` to disable the fancy spinner
- `WISHFUL_MAX_TOKENS` — cap the LLM's verbosity (default: 800)
- `WISHFUL_TEMPERATURE` — creativity dial (default: 0 = boring but safe)
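The README doesn't spell out how these sources combine, but a reasonable reading is that explicit `configure()` values beat environment variables, which beat defaults. A tiny sketch of that assumed precedence (`resolve` is a hypothetical helper, not wishful's API):

```python
import os

# Assumed precedence: configure() overrides > environment variables > defaults.
def resolve(option, env_names, default, overrides):
    if option in overrides:
        return overrides[option]
    for name in env_names:
        if name in os.environ:
            return os.environ[name]
    return default

os.environ["WISHFUL_MODEL"] = "gpt-4o-mini"
print(resolve("model", ["WISHFUL_MODEL", "DEFAULT_MODEL"], "gpt-4o", {}))
# -> gpt-4o-mini (env var wins over the default)
print(resolve("model", ["WISHFUL_MODEL"], "gpt-4o", {"model": "claude-3-haiku"}))
# -> claude-3-haiku (explicit override wins over everything)
```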
🛡️ Safety Rails: Wishful Isn't That Reckless
We're not complete anarchists here. Generated code gets AST-scanned to block obviously dangerous patterns:
- ❌ Imports like
os,subprocess,sys - ❌ Calls to
eval()orexec() - ❌
open()in write/append mode - ❌ Shenanigans like
os.system()orsubprocess.call()
Override at your own peril: WISHFUL_UNSAFE=1 or allow_unsafe=True turns off the guardrails. We won't judge. (We will totally judge.)
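For the curious, here is a minimal sketch of what an AST safety scan like this can look like. This is not wishful's actual checker; the function name, banned-name sets, and messages are all invented for illustration:

```python
import ast

# Hypothetical deny-lists; wishful's real checker may block more (or less).
BANNED_IMPORTS = {"os", "subprocess", "sys"}
BANNED_CALLS = {"eval", "exec"}

def looks_unsafe(source: str) -> list[str]:
    """Return a list of reasons `source` fails this sketched safety scan."""
    problems = []
    for node in ast.walk(ast.parse(source)):
        # Block `import os`, `import subprocess as sp`, etc.
        if isinstance(node, ast.Import):
            for alias in node.names:
                if alias.name.split(".")[0] in BANNED_IMPORTS:
                    problems.append(f"import of {alias.name!r}")
        elif isinstance(node, ast.ImportFrom):
            if node.module and node.module.split(".")[0] in BANNED_IMPORTS:
                problems.append(f"import from {node.module!r}")
        elif isinstance(node, ast.Call):
            func = node.func
            # Block bare eval()/exec() calls.
            if isinstance(func, ast.Name) and func.id in BANNED_CALLS:
                problems.append(f"call to {func.id}()")
            # Block open(...) in write/append/create mode.
            if isinstance(func, ast.Name) and func.id == "open":
                modes = [a.value for a in node.args[1:2] if isinstance(a, ast.Constant)]
                modes += [k.value.value for k in node.keywords
                          if k.arg == "mode" and isinstance(k.value, ast.Constant)]
                for mode in modes:
                    if isinstance(mode, str) and any(c in mode for c in "wa+x"):
                        problems.append(f"open() with mode {mode!r}")
    return problems

print(looks_unsafe("import subprocess"))          # flags the import
print(looks_unsafe("def f(xs): return sum(xs)"))  # [] -- clean
```

Note that a deny-list scan like this is best-effort, not a sandbox: it catches the obvious patterns listed above, which is exactly why the "danger zone" warning exists.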
🧪 Testing: Wishes Without Consequences
Need deterministic, offline behavior? Set WISHFUL_FAKE_LLM=1 and wishful will generate placeholder stub functions instead of hitting the network.
Perfect for CI, unit tests, or when your Wi-Fi is acting up.
```bash
export WISHFUL_FAKE_LLM=1
python my_tests.py  # No API calls, just predictable stubs
```
🔮 How the Magic Actually Works
Here's the 30-second version:
- Import hook: wishful installs a `MagicFinder` on `sys.meta_path` that intercepts `wishful.*` imports.
- Cache check: If `.wishful/<module>.py` exists, it loads instantly. No AI needed.
- LLM generation: If not cached, wishful calls the LLM (via `litellm`) to generate the code based on your import and surrounding context.
- Validation: The generated code is AST-parsed and safety-checked (unless you disabled that like a madman).
- Execution: Code is written to `.wishful/`, compiled, and executed as the import result.
- Transparency: The cache is just plain Python files. Edit them. Commit them. They're yours.
It's import hooks meets LLMs meets "why didn't this exist already?"
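The cache-hit path of that flow can be sketched with a minimal meta-path finder. Everything below is illustrative, not wishful's source: the internals of `MagicFinder`, the loader classes, and the demo module are all assumptions, and the LLM step is stubbed out with a hand-written cache file.

```python
import sys
from importlib.abc import Loader, MetaPathFinder
from importlib.machinery import ModuleSpec
from pathlib import Path

CACHE_DIR = Path(".wishful")

class MagicFinder(MetaPathFinder):
    """Sketch: intercept `wishful.*` imports and serve cached files."""

    def find_spec(self, fullname, path=None, target=None):
        if fullname == "wishful":
            # Stand-in parent package so this sketch runs on its own.
            return ModuleSpec(fullname, EmptyPackageLoader(), is_package=True)
        if not fullname.startswith("wishful."):
            return None  # leave every other import to the normal machinery
        cached = CACHE_DIR / (fullname.split(".", 1)[1] + ".py")
        if not cached.exists():
            return None  # real wishful would call the LLM here, then cache
        return ModuleSpec(fullname, CachedLoader(cached))

class EmptyPackageLoader(Loader):
    def create_module(self, spec):
        return None  # use the default empty module

    def exec_module(self, module):
        pass  # nothing to run: it's just a namespace

class CachedLoader(Loader):
    def __init__(self, path):
        self.path = path

    def create_module(self, spec):
        return None

    def exec_module(self, module):
        code = compile(self.path.read_text(), str(self.path), "exec")
        exec(code, module.__dict__)

# Demo: fake a cached wish, install the finder, import it like normal code.
CACHE_DIR.mkdir(exist_ok=True)
(CACHE_DIR / "demo.py").write_text("def greet(name):\n    return f'hi {name}'\n")
sys.meta_path.insert(0, MagicFinder())

from wishful.demo import greet
print(greet("world"))  # hi world
```

Once the finder is on `sys.meta_path`, the cached file behaves exactly like any other module, which is why the second run of a wish costs nothing.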
🎭 Fun with Wishful Thinking
```python
# Need some cosmic horror? Just wish for it.
from wishful.story import cosmic_horror_intro

intro = cosmic_horror_intro(
    setting="a deserted amusement park",
    word_count_at_least=100,
)
print(intro)  # 🎢👻

# Math that writes itself
from wishful.numbers import primes_from_to, sum_list

total = sum_list(list=primes_from_to(1, 100))
print(total)  # 1060 (probably)

# Because who has time to write date parsers?
from wishful.dates import parse_fuzzy_date

print(parse_fuzzy_date("next Tuesday"))  # Your guess is as good as mine
```
🤔 FAQ (Frequently Asked Wishes)
Q: Is this production-ready?
A: Define "production." 🙃
Q: What if the LLM generates bad code?
A: That's what the cache is for. Check .wishful/, tweak it, commit it, and it's locked in.
Q: Can I use this with OpenAI/Claude/local models?
A: Yep! We use litellm, so anything it supports, we support.
Q: What if I import something that doesn't make sense?
A: The LLM will do its best. Results may vary. Hilarity may ensue.
Q: Is this just lazy programming?
A: It's not lazy. It's efficient wishful thinking. 😎
📜 License
MIT. Wish responsibly.
Go forth and wish. ✨
Your imports will never be the same.
File details

Details for the file wishful-0.1.0.tar.gz.

File metadata

- Download URL: wishful-0.1.0.tar.gz
- Upload date:
- Size: 11.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.11 (macOS)

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `e47bbc96480c86c4a9ef3cecbed70433f14b830310f8c7ac19ef4057e839e213` |
| MD5 | `d33fc261fcca10565ed338e776cde324` |
| BLAKE2b-256 | `61f381721e1b488685ce6c45ec243296f88fff1b5903f2fa6705b6c141c02007` |
File details

Details for the file wishful-0.1.0-py3-none-any.whl.

File metadata

- Download URL: wishful-0.1.0-py3-none-any.whl
- Upload date:
- Size: 16.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.11 (macOS)

File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `732698a0b27f5a40dec461104fda547a745643166b047aab897fa191dd695471` |
| MD5 | `264576c73aff5b2a81e4600792718f25` |
| BLAKE2b-256 | `0663947939514ddca6c1a99bec4960c9c8edee7756b53c6aa5d610e9233d5b4b` |