
🚀 SuperOptiX AI

Full Stack Agentic AI Framework

Evaluation-First • Optimization-Core • Orchestration-Ready


🎯 Quick Start

```bash
# Install SuperOptiX
pip install superoptix

# Or with uv
uv pip install superoptix

# Install GPT-OSS models (OpenAI's latest open-source models)
super model install gpt-oss:20b   # Advanced reasoning
super model install gpt-oss:120b  # Production-level reasoning

# Open the docs and follow the steps
super docs
```

## ⚠️ Important Note: CrewAI Dependency Conflict

**CrewAI has a known dependency conflict** with SuperOptiX due to incompatible `json-repair` version requirements:

- **DSPy 3.0.0** requires `json-repair>=0.30.0`
- **CrewAI 0.157.0** requires `json-repair==0.25.2`

**To use CrewAI with SuperOptiX, install it manually:**
```bash
# 1. Install SuperOptiX with DSPy support
pip install "superoptix[optimas]"

# 2. Install CrewAI without dependencies
pip install crewai==0.157.0 --no-deps

# 3. Ensure compatible json-repair version
pip install "json-repair>=0.30.0"

```

See our Installation Guide for more details.
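After following these steps, it can be worth confirming from Python that the installed `json-repair` actually meets DSPy's `>=0.30.0` floor. A minimal sketch using only the standard library; the helper names below are illustrative, not part of the SuperOptiX API:

```python
# Sanity check: does the installed json-repair satisfy DSPy's requirement?
from importlib import metadata


def version_tuple(version):
    """Parse a dotted version string like '0.30.0' into a comparable tuple."""
    return tuple(int(part) for part in version.split(".")[:3])


def json_repair_ok(minimum="0.30.0"):
    """Return True if json-repair is installed and at least `minimum`."""
    try:
        installed = metadata.version("json-repair")
    except metadata.PackageNotFoundError:
        return False
    return version_tuple(installed) >= version_tuple(minimum)


if __name__ == "__main__":
    print("json-repair OK:", json_repair_ok())
```

Note that `pip install --no-deps` skips dependency resolution entirely, so pip will not warn you if this check fails.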


## 🤖 GPT-OSS Model Support

SuperOptiX now supports **OpenAI's latest open-source models** with **native Apple Silicon support**:

- **GPT-OSS-20B**: 21B parameters (3.6B active) - Perfect for advanced reasoning
- **GPT-OSS-120B**: 117B parameters (5.1B active) - Production-ready for complex tasks

**🍎 Apple Silicon Support:**
- **MLX-LM v0.26.3**: Native Apple Silicon support with proper MXFP4 quantization
- **Ollama GGUF**: Optimized format for best performance (19.7 t/s)
- **Multiple Backend Options**: Choose between MLX (native) and Ollama (performance)

**Key Features:**
- 🔓 **Apache 2.0 License** - Build freely without restrictions
- ⚡ **Native MXFP4 Quantization** - Optimized for efficient inference
- 🍎 **Apple Silicon Native** - No more mixed precision issues

**Usage:**
```bash
# MLX-LM (native Apple Silicon support)
super model run openai_gpt-oss-20b "prompt" --backend mlx

# Ollama (best performance)
super model run gpt-oss:20b "prompt" --backend ollama
```
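For context on what the Ollama backend is doing, a prompt can also be sent straight to Ollama's documented local HTTP API, independent of the `super` CLI. A hedged sketch that assumes an Ollama server is running on its default port with `gpt-oss:20b` already pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(prompt, model="gpt-oss:20b"):
    """Build the JSON payload Ollama's /api/generate expects."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def generate(prompt, model="gpt-oss:20b"):
    """Send the prompt to a locally running Ollama server and return the text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(prompt, model),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Setting `"stream": False` asks Ollama for a single JSON object instead of a stream of partial responses, which keeps the client trivial.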


📚 Learn More

| Resource | Description | Link |
| --- | --- | --- |
| 🌐 Website | Learn about our vision and solutions | superoptix.ai |
| 📖 GitHub | Source code and project repository | @SuperagenticAI/superoptix-ai |
| 📦 PyPI | Install via pip | superoptix |

🆘 Support


📄 License

This project is licensed under a proprietary license. For licensing inquiries, contact licensing@super-agentic.ai.


🚀 Ready to Build the Future?

Start with SuperOptiX • Read the Docs • Join the Revolution

Powered by DSPy. Refined by Superagentic AI.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

superoptix-0.1.0b16-py3-none-any.whl (670.6 kB)

Uploaded: Python 3

File details

Details for the file superoptix-0.1.0b16-py3-none-any.whl.

File metadata

  • Download URL: superoptix-0.1.0b16-py3-none-any.whl
  • Upload date:
  • Size: 670.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.11

File hashes

Hashes for superoptix-0.1.0b16-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 010b1c3d755c66093358e818d3624614dbe229ff981a4d0b76b09837e0af0103 |
| MD5 | 0e9ff4be3239da1a50765373f927d798 |
| BLAKE2b-256 | d8a3d97e47397bc4f068a5ac4bfe60989f95bf7d553c818238424ce9513546dd |

See more details on using hashes here.
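To put the SHA256 digest above to use, a downloaded wheel can be checked locally before installation. A minimal sketch with Python's standard `hashlib`; the function names are illustrative:

```python
import hashlib

# SHA256 digest published above for superoptix-0.1.0b16-py3-none-any.whl
EXPECTED_SHA256 = "010b1c3d755c66093358e818d3624614dbe229ff981a4d0b76b09837e0af0103"


def sha256_of(path):
    """Stream the file in chunks and return its hex SHA256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def wheel_matches(path, expected=EXPECTED_SHA256):
    """True if the file at `path` has the published digest."""
    return sha256_of(path) == expected
```

Reading in fixed-size chunks keeps memory use constant regardless of file size, which matters little for a 670 kB wheel but is the idiomatic pattern.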
