
Project description

matrx-ai

Matrx AI Engine — unified AI orchestration backend with multi-provider LLM support, streaming, tool execution, and conversation persistence.

Installation

# Core library only (AI providers, tools, conversation management)
pip install matrx-ai

# With the FastAPI server layer
pip install "matrx-ai[server]"

# Or with uv
uv add matrx-ai
uv add "matrx-ai[server]"

Quick Start (library)

import matrx_ai
from matrx_ai.config.unified_config import UnifiedConfig
from matrx_ai.orchestrator import execute_ai_request

# Initialize once at startup (registers the database connection)
matrx_ai.initialize()

# Use the AI engine directly via execute_ai_request and UnifiedConfig

Quick Start (server)

uv sync --extra server   # install with server deps
cp .env.example .env     # fill in your API keys
make dev                 # start dev server on :8000

Environment Variables

Copy .env.example to .env and fill in the required keys:

  • AI provider keys: OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, GROQ_API_KEY, etc.
  • Supabase: SUPABASE_URL, SUPABASE_ANON_KEY, SUPABASE_SERVICE_ROLE_KEY, SUPABASE_MATRIX_*
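Since the server fails at runtime when a key is absent, it can help to verify the environment at startup. A minimal sketch, not part of the package, assuming plain `os.environ` access; the helper name `check_required_env` and the exact key list are illustrative:

```python
import os

# Keys the docs list as required; adjust to the providers you actually use.
REQUIRED_KEYS = [
    "OPENAI_API_KEY",
    "ANTHROPIC_API_KEY",
    "SUPABASE_URL",
    "SUPABASE_ANON_KEY",
]


def check_required_env(keys):
    """Return the subset of keys missing or empty in the environment."""
    return [k for k in keys if not os.environ.get(k)]


missing = check_required_env(REQUIRED_KEYS)
if missing:
    print("Missing environment variables:", ", ".join(missing))
```

Running this before `matrx_ai.initialize()` surfaces configuration problems early instead of mid-request.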

Common Commands

make dev        # dev server with reload on :8000
make run        # production mode (matrx-ai-server)
make lint       # ruff check
make fmt        # ruff format
make typecheck  # pyright
make test       # pytest -v

Publishing a New Release

./scripts/publish.sh              # patch bump  0.1.0 → 0.1.1
./scripts/publish.sh --minor      # minor bump  0.1.0 → 0.2.0
./scripts/publish.sh --major      # major bump  0.1.0 → 1.0.0
./scripts/publish.sh --message "feat: add new provider"
./scripts/publish.sh --dry-run    # preview without committing
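The bump flags above follow standard MAJOR.MINOR.PATCH semantics. A small illustrative helper showing those semantics in Python (the real publish.sh is a shell script; this is not its implementation):

```python
def bump_version(version: str, part: str = "patch") -> str:
    """Bump a MAJOR.MINOR.PATCH version string, resetting lower parts to 0."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"
```

For example, `bump_version("0.1.0", "minor")` yields "0.2.0", matching the `--minor` line above.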

Version History

Version  Highlights
v0.1.4   Patch release
v0.1.3   Patch release
v0.1.2   Patch release
v0.1.1   Patch release
v0.1.0   Initial release — multi-provider AI orchestration, streaming, tool system, conversation persistence

Download files

Download the file for your platform.

Source Distribution

matrx_ai-0.1.4.tar.gz (602.5 kB)


Built Distribution


matrx_ai-0.1.4-py3-none-any.whl (441.8 kB)


File details

Details for the file matrx_ai-0.1.4.tar.gz.

File metadata

  • Download URL: matrx_ai-0.1.4.tar.gz
  • Size: 602.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for matrx_ai-0.1.4.tar.gz
Algorithm Hash digest
SHA256 84d46903ecf4382ab8a48dfc9f663e71c39b2396174eff29c24dcdd48fbfcaea
MD5 f166c3ec4b8bacb001eb87fbbe1d09d7
BLAKE2b-256 35c42630e36b78c990416b6dc7a579c76b513979be8cfd304df889884ac70f6f

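The published digests let you verify a download before installing it. A standard-library sketch, assuming the file has been downloaded to the working directory; the function name `sha256_of_file` is my own:

```python
import hashlib


def sha256_of_file(path: str, chunk_size: int = 8192) -> str:
    """Compute the hex SHA256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Digest published above for the sdist:
EXPECTED = "84d46903ecf4382ab8a48dfc9f663e71c39b2396174eff29c24dcdd48fbfcaea"

# After downloading, compare against the published value:
# assert sha256_of_file("matrx_ai-0.1.4.tar.gz") == EXPECTED
```

For installs, pip can enforce this automatically via hash-checking mode (`pip install --require-hashes -r requirements.txt` with `--hash=sha256:...` pins).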

Provenance

The following attestation bundles were made for matrx_ai-0.1.4.tar.gz:

Publisher: publish.yml on armanisadeghi/matrx-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file matrx_ai-0.1.4-py3-none-any.whl.

File metadata

  • Download URL: matrx_ai-0.1.4-py3-none-any.whl
  • Size: 441.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for matrx_ai-0.1.4-py3-none-any.whl
Algorithm Hash digest
SHA256 7135ec8945bc10e23500438048f9885672d3bde87d3bac60d1053566e730315e
MD5 62ce24934b45f983ee0aa496318bf94f
BLAKE2b-256 7cdecbfbce1fe16f3edf9d93dd13652c726c19cbb656cfd09ebf00bcab39a30f


Provenance

The following attestation bundles were made for matrx_ai-0.1.4-py3-none-any.whl:

Publisher: publish.yml on armanisadeghi/matrx-ai

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
