
matrx-ai

Matrx AI Engine — unified AI orchestration backend with multi-provider LLM support, streaming, tool execution, and conversation persistence.

Installation

# Core library only (AI providers, tools, conversation management)
pip install matrx-ai

# With the FastAPI server layer
pip install "matrx-ai[server]"

# Or with uv
uv add matrx-ai
uv add "matrx-ai[server]"

Quick Start (library)

import matrx_ai

# Initialize once at startup (registers the database connection)
matrx_ai.initialize()

# Use the AI engine directly
from matrx_ai.orchestrator import execute_ai_request
from matrx_ai.config.unified_config import UnifiedConfig

Quick Start (server)

uv sync --extra server   # install with server deps
cp .env.example .env     # fill in your API keys
make dev                 # start dev server on :8000

Environment Variables

Copy .env.example to .env and fill in the required keys:

  • AI provider keys: OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, GROQ_API_KEY, etc.
  • Supabase: SUPABASE_URL, SUPABASE_ANON_KEY, SUPABASE_SERVICE_ROLE_KEY, SUPABASE_MATRIX_*
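A minimal sketch of a startup check for these variables (the variable names come from the list above; the helper itself is illustrative and not part of matrx-ai):

```python
import os

# Names taken from the required-keys list above; adjust for the providers you use.
REQUIRED_VARS = ["OPENAI_API_KEY", "SUPABASE_URL", "SUPABASE_ANON_KEY"]

def check_required_env(required=REQUIRED_VARS):
    """Return the names of any required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

missing = check_required_env()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```

Running a check like this before `matrx_ai.initialize()` gives a clearer failure than a provider error deep inside a request.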

Common Commands

make dev        # dev server with reload on :8000
make run        # production mode (matrx-ai-server)
make lint       # ruff check
make fmt        # ruff format
make typecheck  # pyright
make test       # pytest -v

Publishing a New Release

./scripts/publish.sh              # patch bump  0.1.0 → 0.1.1
./scripts/publish.sh --minor      # minor bump  0.1.0 → 0.2.0
./scripts/publish.sh --major      # major bump  0.1.0 → 1.0.0
./scripts/publish.sh --message "feat: add new provider"
./scripts/publish.sh --dry-run    # preview without committing
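The bump flags follow standard semantic versioning. As an illustration of the version arithmetic (this is not the script's actual implementation):

```python
def bump(version: str, part: str = "patch") -> str:
    """Bump a 'MAJOR.MINOR.PATCH' version string, resetting lower parts to zero."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

print(bump("0.1.0"))            # → 0.1.1 (patch, the default)
print(bump("0.1.0", "minor"))   # → 0.2.0
print(bump("0.1.0", "major"))   # → 1.0.0
```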

Version History

  • v0.1.5: Patch release
  • v0.1.4: Patch release
  • v0.1.3: Patch release
  • v0.1.2: Patch release
  • v0.1.1: Patch release
  • v0.1.0: Initial release — multi-provider AI orchestration, streaming, tool system, conversation persistence

Download files

Download the file for your platform.

Source Distribution

matrx_ai-0.1.5.tar.gz (593.9 kB)

Built Distribution


matrx_ai-0.1.5-py3-none-any.whl (441.7 kB)

File details

Details for the file matrx_ai-0.1.5.tar.gz.

File metadata

  • Download URL: matrx_ai-0.1.5.tar.gz
  • Size: 593.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for matrx_ai-0.1.5.tar.gz:

  • SHA256: 52abfdfc25e9832a724101f9253cea58aaf03cc1a70653b0361e61698d40d6ed
  • MD5: 666985f7dff322ebd12e912f3d94f5b7
  • BLAKE2b-256: 33dfc1ac5a3c03459039fe24cc9a4756358f603f69bd4e83c6b68d267b55aacf
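To check a downloaded file against the published SHA256 digest, something like the following works with the standard library (the expected digest is the one listed above):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Compute the hex SHA256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

expected = "52abfdfc25e9832a724101f9253cea58aaf03cc1a70653b0361e61698d40d6ed"
# After downloading the sdist:
# assert sha256_of("matrx_ai-0.1.5.tar.gz") == expected
```

pip can also enforce this automatically via hash-checking mode in a requirements file.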


Provenance

The following attestation bundles were made for matrx_ai-0.1.5.tar.gz:

Publisher: publish.yml on armanisadeghi/matrx-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file matrx_ai-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: matrx_ai-0.1.5-py3-none-any.whl
  • Size: 441.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for matrx_ai-0.1.5-py3-none-any.whl:

  • SHA256: 8c1055fc40ab193def76e81e940ff665e4b7685a200ad73d491b2a1c7a4436ca
  • MD5: 7373fd59cbde5ebbc47d152617001cb3
  • BLAKE2b-256: 31b080a1108de386aa5601f2246888953b142cb47597e0f3d2a3b5d4cd29f4b7


Provenance

The following attestation bundles were made for matrx_ai-0.1.5-py3-none-any.whl:

Publisher: publish.yml on armanisadeghi/matrx-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
