# Viscacha

Background jobs for Python — no broker, no infrastructure. Built for AI pipelines: every job is crash-safe, traceable, and retriable.
```python
from viscacha import Client, Worker

client = Client()
worker = Worker(client)

@worker.job("greet")
def greet(name: str) -> dict:
    return {"message": f"Hello, {name}!"}

worker.run(blocking=False)

handle = client.enqueue("greet", name="Alice")
result = handle.wait()
print(result.result)  # {'message': 'Hello, Alice!'}
```
No broker. No Redis. No Docker. Just Python.
## Install

```shell
pip install uv
uv pip install -e .
```

Requires Python 3.10+.
## How it works

- Submit a job
- A worker function runs it
- Get the result, or inspect what happened
```python
handle = client.enqueue("send_email", to="alice@example.com")
result = handle.wait(timeout=30)  # raises TimeoutError if it doesn't finish

print(result.status)  # 'done' | 'failed' | 'cancelled'
print(result.result)  # return value of the job function
print(result.error)   # set if failed, else None

handle.cancel()             # cancel a pending job
client.jobs()               # list all jobs
client.jobs(status="done")  # filter by status
client.get(handle.id)       # get one by ID
```
## Guarantees
- No lost jobs — a job stays in the queue until a worker completes it
- Safe retries — transient failures retry automatically
- Full traceability — every job logged with type, args, result, retries, error
- Crash-safe — if a worker dies mid-job, the lease expires and the job returns to the queue
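The lease mechanism behind the crash-safety guarantee can be sketched in a few lines of plain Python. This is a toy illustration, not Viscacha's actual internals; every name in it is hypothetical:

```python
# Toy lease model (hypothetical, for illustration only): a claimed job
# holds a lease until `now + lease_ttl`; if the worker dies without
# completing it, the lease expires and another worker can claim it.
class ToyQueue:
    def __init__(self, lease_ttl: float) -> None:
        self.lease_ttl = lease_ttl
        self.leases: dict[str, float] = {}  # job_id -> lease expiry time

    def enqueue(self, job_id: str) -> None:
        self.leases[job_id] = 0.0  # an expired lease means "claimable"

    def claim(self, job_id: str, now: float) -> bool:
        if now >= self.leases[job_id]:
            self.leases[job_id] = now + self.lease_ttl
            return True
        return False

q = ToyQueue(lease_ttl=30.0)
q.enqueue("job-1")
assert q.claim("job-1", now=0.0)       # worker A claims the job
assert not q.claim("job-1", now=10.0)  # worker B can't steal a live lease
assert q.claim("job-1", now=31.0)      # A crashed; lease expired, B reclaims
```

Because a claim only succeeds once the previous lease has lapsed, no job is ever lost and no job is handed to two live workers at once.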
## AI pipelines

Each Claude call is a job. Workers run in parallel. Failures retry automatically.
```python
import anthropic
from viscacha import Client, Worker

client = Client()
worker = Worker(client)
ai = anthropic.Anthropic()

@worker.job("classify_ticket", max_retries=2)
def classify_ticket(title: str, body: str) -> dict:
    response = ai.messages.create(
        model="claude-haiku-4-5-20251001",
        max_tokens=120,
        messages=[{"role": "user", "content": f"Classify: {title}\n{body}"}],
    )
    # Return the model's answer as the job result
    return {"classification": response.content[0].text}

worker.run(blocking=False)

handles = [client.enqueue("classify_ticket", title=t, body=b) for t, b in tickets]
results = [h.wait(timeout=30) for h in handles]
```
```shell
ANTHROPIC_API_KEY=sk-... python demos/demo_ai_jobs.py
```
## Any function works

Email, HTTP calls, reports, transforms — a worker is just a function.
```python
@worker.job("send_email")
def send_email(to: str, subject: str, html: str) -> dict:
    return {"to": to, "sent": True}

client.enqueue("send_email", to="bob@example.com", subject="Order confirmed", html="...")
```
```shell
python demos/demo_email_jobs.py  # dry-run, no SMTP needed
```
## Retries and crash recovery

```python
import requests

@worker.job("call_api", max_retries=5, lease_ttl=60.0)
def call_api(endpoint: str) -> dict:
    response = requests.get(endpoint, timeout=10)
    response.raise_for_status()
    return response.json()
```

- `max_retries` — retries on any exception (default 3)
- `lease_ttl` — seconds before a stalled job is reclaimed (default 30)
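The retry semantics can be pictured as a simple loop: attempt the function up to `max_retries + 1` times, and only surface the final failure. This is a sketch of the general pattern, not Viscacha's implementation:

```python
# Illustrative retry loop (not Viscacha's implementation): the job
# function gets max_retries + 1 attempts; the last exception is
# re-raised only if every attempt fails.
def run_with_retries(fn, max_retries=3):
    last_error = None
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except Exception as exc:
            last_error = exc
    raise last_error

calls = []

def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("transient failure")
    return "ok"

assert run_with_retries(flaky) == "ok"  # succeeds on the third attempt
assert len(calls) == 3
```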
## Persistence

```python
client = Client(log_path="jobs.jsonl")
```

Append-only log. Jobs survive restarts.
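Because the log is plain JSON Lines, it is easy to inspect with the standard library. The record fields below are assumptions for illustration; check your own `jobs.jsonl` for the actual schema:

```python
import json
import os
import tempfile

# Append some illustrative records, then filter them back out.
# Field names ("id", "job_type", "status") are assumptions, not
# Viscacha's documented schema.
path = os.path.join(tempfile.mkdtemp(), "jobs.jsonl")
records = [
    {"id": "1", "job_type": "greet", "status": "done"},
    {"id": "2", "job_type": "greet", "status": "failed"},
]
with open(path, "a") as f:  # append-only, like the job log
    for rec in records:
        f.write(json.dumps(rec) + "\n")

with open(path) as f:
    jobs = [json.loads(line) for line in f]

done = [j for j in jobs if j["status"] == "done"]
assert [j["id"] for j in done] == ["1"]
```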
## HTTP API

Expose jobs over HTTP so workers can run anywhere:

```python
from viscacha import Client
from viscacha.server import create_app
import uvicorn

app = create_app(Client(log_path="jobs.jsonl"))
uvicorn.run(app, host="0.0.0.0", port=8000)
```

```shell
curl -X POST http://localhost:8000/jobs \
  -H "Content-Type: application/json" \
  -d '{"job_type": "greet", "args": {"name": "Alice"}}'

curl http://localhost:8000/jobs?status=done
```
## Under the hood
Jobs are tuples in an append-only tuple space. Workers claim jobs via leases — if a worker crashes, the lease expires and the job returns to the queue automatically. The coordination layer handles ordering, crash safety, and observability. Viscacha is a thin API on top.
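A tuple space reduces to two operations: write a tuple, and atomically take a matching one. A minimal sketch (illustrative only; Viscacha's coordination layer is more involved than this):

```python
import threading

# Minimal tuple space sketch (not Viscacha's implementation): `take`
# removes a matching tuple under a lock, so each job goes to exactly
# one worker even with concurrent claimants.
class TupleSpace:
    def __init__(self) -> None:
        self._lock = threading.Lock()
        self._tuples: list[tuple] = []

    def write(self, tup: tuple) -> None:
        with self._lock:
            self._tuples.append(tup)

    def take(self, job_type: str):
        with self._lock:
            for i, tup in enumerate(self._tuples):
                if tup[0] == job_type:
                    return self._tuples.pop(i)
        return None

space = TupleSpace()
space.write(("greet", {"name": "Alice"}))
assert space.take("greet") == ("greet", {"name": "Alice"})
assert space.take("greet") is None  # already taken: exactly-once handoff
```

The atomic `take` is what makes the handoff safe; combining it with the lease expiry described above gives crash recovery on top of exactly-once claiming.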
## Roadmap
- Priority queues
- Job chaining / workflows
- Web dashboard
- Scheduled / cron jobs
- Distributed workers (multi-process, multi-host)