SOTA Omni-Modal Personal AI Orchestrator & Engine
📦 Xorfice: The SOTA Omni-Modal Intelligence Engine
Xorfice is the official, high-performance orchestration engine for Xoron-Dev. It provides a secure, production-grade interface for the world's most advanced 5B Sparse Mixture of Experts (MoE) model, enabling seamless reasoning across Text, Vision, Video, and Audio.
🚀 Why Xorfice?
Xorfice is designed for developers who demand state-of-the-art multimodal performance without the complexity of manual model management.
🛡️ Enterprise-Grade Trust
- Official Interface: Developed and maintained by the Backup-bdg team.
- Privacy First: All multimodal processing (Vision, Audio, Video) occurs locally on your hardware.
- Validated Weights: Automatic checksum verification for all model weights downloaded from HuggingFace.
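Checksum validation of downloaded weights follows the usual streaming-digest pattern; the sketch below is illustrative (the `verify_checksum` helper is not Xorfice's actual API):

```python
import hashlib
from pathlib import Path

def verify_checksum(path: Path, expected_sha256: str, chunk_size: int = 1 << 20) -> bool:
    """Stream the file through SHA-256 in chunks and compare against the published digest."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256
```

Streaming in chunks keeps memory flat even for multi-gigabyte weight shards.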
✨ SOTA Features
- 🧠 Sparse MoE Orchestration: Native support for 8-expert routing with Deep Expert invocation (depths up to 5) for complex reasoning tokens.
- ⚡ Fast Ponder Latents: Attention-free depth-3 reasoning block for 10x-20x thought acceleration.
- 👁️ SigLIP-2 & TiTok Vision: Built on SigLIP-2 for superior zero-shot alignment and 2.2x token compression.
- 🎬 VidTok Video Logic: 3D volumetric compression for understanding motion and causality.
- 🎙️ Raw PCM Audio: Direct Conformer-based ingestion of 16kHz audio for sub-200ms Speech-to-Speech latency.
- 🎨 Creative Power: Integrated pipelines for Text-to-Video (T2V) and Image-to-Video (I2V).
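The 8-expert sparse routing above can be illustrated with a generic top-k gating sketch. This is the standard MoE pattern, not Xorfice's internal code; shapes and `k=2` are assumptions for the example:

```python
import numpy as np

def route_top_k(router_logits: np.ndarray, k: int = 2):
    """Select the top-k experts per token and renormalize their gate weights."""
    top_idx = np.argsort(router_logits, axis=-1)[..., -k:]            # (tokens, k)
    top_logits = np.take_along_axis(router_logits, top_idx, axis=-1)
    gates = np.exp(top_logits - top_logits.max(axis=-1, keepdims=True))
    gates /= gates.sum(axis=-1, keepdims=True)                        # softmax over k
    return top_idx, gates

# Example: 4 tokens routed over 8 experts, top-2 active per token
logits = np.random.randn(4, 8)
experts, gates = route_top_k(logits, k=2)
```

Only the selected experts run for each token, which is what keeps a 5B-parameter MoE cheap per forward pass.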
🛠️ Installation
Install the latest stable version from PyPI:

```bash
pip install xorfice
```
💡 Quick Start
Get up and running with the XoronEngine in seconds.
```python
from xorfice import XoronEngine

# The engine automatically handles hardware optimization (CUDA/VRAM)
# Weights are verified and cached from Backup-bdg/Xoron-Dev-MultiMoe
engine = XoronEngine(model_path="Backup-bdg/Xoron-Dev-MultiMoe")

# Multimodal reasoning over video and audio inputs
response = engine.generate(
    prompt="Analyze the speaker's emotions in this video.",
    videos="https://example.com/interview.mp4",
    audios="https://example.com/interview_audio.wav",
)

print(f"Xoron: {response['text']}")
```
⚙️ Performance & Optimization
Xorfice includes industry-leading optimization techniques:
- Expert Offloading: Run 5B+ parameter models on 8GB VRAM consumer GPUs.
- Paged KV Cache: Massive throughput for long-context (128K) reasoning.
- Adaptive Precision: Automatic switching between FP16 and BF16 based on hardware capability.
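The paged KV-cache idea can be sketched as a toy block allocator: sequences grow in fixed-size blocks drawn from a shared pool, so a 128K context never needs one contiguous buffer. This is illustrative only, not Xorfice's implementation:

```python
class PagedKVCache:
    """Toy paged KV-cache allocator (vLLM-style paging, simplified)."""

    def __init__(self, num_blocks: int, block_size: int):
        self.block_size = block_size
        self.free_blocks = list(range(num_blocks))
        self.page_tables: dict[int, list[int]] = {}  # seq_id -> physical block ids
        self.lengths: dict[int, int] = {}            # seq_id -> tokens stored

    def append_token(self, seq_id: int) -> int:
        """Reserve cache space for one new token; returns the physical block used."""
        n = self.lengths.get(seq_id, 0)
        table = self.page_tables.setdefault(seq_id, [])
        if n % self.block_size == 0:  # current block full (or first token)
            table.append(self.free_blocks.pop())
        self.lengths[seq_id] = n + 1
        return table[-1]

    def free(self, seq_id: int) -> None:
        """Return a finished sequence's blocks to the shared pool."""
        self.free_blocks.extend(self.page_tables.pop(seq_id, []))
        self.lengths.pop(seq_id, None)
```

Because blocks are recycled the moment a sequence finishes, many long-context requests can share a fixed VRAM budget.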
🤝 Community & Support
- Model Hub: Backup-bdg on HuggingFace
- Documentation: Accessible locally at http://localhost:8000 (Documentation Tab) while the engine is running.
Xorfice: Powering the next generation of omni-modal agents.
Project details
Download files
Source Distribution: xorfice-0.1.43.tar.gz
Built Distribution: xorfice-0.1.43-py3-none-any.whl
File details
Details for the file xorfice-0.1.43.tar.gz.
File metadata
- Download URL: xorfice-0.1.43.tar.gz
- Upload date:
- Size: 343.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `ccc3cf529eef61023dcaadb55b4746510573dcd2d9c0039710c025fc9ed99ac9` |
| MD5 | `ec9f635af958dd7d4af93525c3b719b4` |
| BLAKE2b-256 | `3d5acc5e333e65eb1633a9cfb631979ac88a6f02cd0168e5a064f66d68727ff2` |
File details
Details for the file xorfice-0.1.43-py3-none-any.whl.
File metadata
- Download URL: xorfice-0.1.43-py3-none-any.whl
- Upload date:
- Size: 359.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `4a75ab53738d688dbf06430be5a340885f156b3b8c32cc180ab447567c1b5355` |
| MD5 | `22fa44fdc97181daa3837075dd0d7eae` |
| BLAKE2b-256 | `a0d513426f510df957cb29bc61c5e9772682600f3799079ab3793f69e78f2e45` |