Run local LLMs from Python. LangChain-compatible. llama.cpp + MLX backends.
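
As an illustration of the LangChain-compatible, llama.cpp-backed workflow the tagline describes, here is a minimal sketch using LangChain's stock `LlamaCpp` wrapper. The model path and sampling parameters are placeholder assumptions, and this package's own entry points may differ from the example shown.

```python
# Minimal sketch: run a local GGUF model from Python via LangChain's
# llama.cpp integration. Assumes `langchain-community` and `llama-cpp-python`
# are installed and a GGUF file exists at the (hypothetical) path below.
from langchain_community.llms import LlamaCpp

llm = LlamaCpp(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # placeholder local model file
    n_ctx=4096,        # context window size
    temperature=0.2,   # keep sampling fairly deterministic
)

print(llm.invoke("Explain what a context window is in one sentence."))
```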