Last released Mar 14, 2026
The unified LLM runtime — local inference, API proxy, and monitoring. A powerful alternative to Ollama + LiteLLM, built in Rust.
Last released Mar 11, 2026
Ultra-powerful personal AI agent with multi-device control, support for 100+ LLM providers via litellm, and 30+ tools