# 🚀 SuperOptiX AI

**Full Stack Agentic AI Framework**

*Evaluation-First • Optimization-Core • Orchestration-Ready*
## 🎯 Quick Start

```bash
# Install SuperOptiX
pip install superoptix

# Or with uv
uv pip install superoptix

# Install GPT-OSS models (OpenAI's latest open-source models)
super model install gpt-oss:20b    # Advanced reasoning
super model install gpt-oss:120b   # Production-level reasoning

# Follow the steps
super docs
```
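To confirm the install took effect (and to see which optional frameworks are present alongside it), here is a small version probe using only the standard library; the package names checked are just examples:

```python
from importlib.metadata import version, PackageNotFoundError
from typing import Optional

def installed_version(package: str) -> Optional[str]:
    """Return the installed version of a distribution, or None if absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

for pkg in ("superoptix", "dspy", "crewai"):
    v = installed_version(pkg)
    print(f"{pkg}: {v or 'not installed'}")
```

This is handy before filing an issue, since mixed or missing versions of the optional frameworks are a common source of confusion.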
## ⚠️ Important Note: CrewAI Dependency Conflict
**CrewAI has a known dependency conflict** with SuperOptiX due to incompatible `json-repair` version requirements:
- **DSPy 3.0.0** requires `json-repair>=0.30.0`
- **CrewAI 0.157.0** requires `json-repair==0.25.2`
**To use CrewAI with SuperOptiX, install it manually:**
```bash
# 1. Install SuperOptiX with DSPy support
pip install "superoptix[optimas]"
# 2. Install CrewAI without dependencies
pip install crewai==0.157.0 --no-deps
# 3. Ensure compatible json-repair version
pip install "json-repair>=0.30.0"
```

See our Installation Guide for more details.
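The two constraints above are mutually unsatisfiable, which is why CrewAI must be installed with `--no-deps`. A quick illustration using plain version-tuple comparison (a stand-in for a full PEP 440 resolver):

```python
def parse(version: str) -> tuple:
    """Parse a simple dotted version string into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def satisfies_both(version: str) -> bool:
    # DSPy 3.0.0 requires json-repair>=0.30.0;
    # CrewAI 0.157.0 pins json-repair==0.25.2
    return parse(version) >= parse("0.30.0") and parse(version) == parse("0.25.2")

# No single version can meet both constraints at once
print(any(satisfies_both(v) for v in ("0.25.2", "0.30.0", "0.35.3")))  # → False
```

Since pip cannot resolve an empty intersection, the `--no-deps` install simply opts CrewAI out of the resolution and lets DSPy's `json-repair>=0.30.0` win.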
## 🤖 GPT-OSS Model Support
SuperOptiX now supports **OpenAI's latest open-source models** with **native Apple Silicon support**:
- **GPT-OSS-20B**: 21B parameters (3.6B active) - Perfect for advanced reasoning
- **GPT-OSS-120B**: 117B parameters (5.1B active) - Production-ready for complex tasks
**🍎 Apple Silicon Support:**
- **MLX-LM v0.26.3**: Native Apple Silicon support with proper MXFP4 quantization
- **Ollama GGUF**: Optimized format for best performance (19.7 t/s)
- **Multiple Backend Options**: Choose between MLX (native) and Ollama (performance)
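One way to choose between the two backends programmatically; this is a minimal heuristic sketch, assuming only the `mlx`/`ollama` values accepted by the CLI's `--backend` flag shown later in this section:

```python
import platform

def pick_backend() -> str:
    """Heuristic: prefer the native MLX backend on Apple Silicon,
    otherwise fall back to Ollama."""
    if platform.system() == "Darwin" and platform.machine() == "arm64":
        return "mlx"
    return "ollama"

print(pick_backend())
```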
**Key Features:**
- 🔓 **Apache 2.0 License** - Build freely without restrictions
- ⚡ **Native MXFP4 Quantization** - Optimized for efficient inference
- 🍎 **Apple Silicon Native** - No more mixed precision issues
**Usage:**

```bash
# MLX-LM (native Apple Silicon support)
super model run openai_gpt-oss-20b "prompt" --backend mlx

# Ollama (best performance)
super model run gpt-oss:20b "prompt" --backend ollama
```

**Resources:**

- GPT-OSS-120B Model - HuggingFace repository
- GPT-OSS-20B Model - HuggingFace repository
- Ollama Library - Ollama model library
## 📚 Learn More
| Resource | Description | Link |
|---|---|---|
| 🌐 Website | Learn about our vision and solutions | superoptix.ai |
| 📖 GitHub | Source code and project repository | @SuperagenticAI/superoptix-ai |
| 📦 PyPI | Install via pip | superoptix |
## 🆘 Support

## 📄 License

This project is licensed under a proprietary license. For licensing inquiries, contact licensing@super-agentic.ai.

## 🚀 Ready to Build the Future?

Start with SuperOptiX • Read the Docs • Join the Revolution

*Powered by DSPy. Refined by Superagentic AI.*
## File details

Details for the file `superoptix-0.1.0b13-py3-none-any.whl`.

**File metadata**

- Download URL: superoptix-0.1.0b13-py3-none-any.whl
- Upload date:
- Size: 622.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.11.13
**File hashes**

| Algorithm | Hash digest |
|---|---|
| SHA256 | `f5c0da974d5391eca218604fa96ee4bc998bbfa40713f9282bf7594e9f7d4ef8` |
| MD5 | `eb655d6cae22751c1426e546eca392c6` |
| BLAKE2b-256 | `bdb3544c52314f987ca13b4b3bdafda8f9b0954a7a09e9d7f93c5e8eaa11ae95` |