
cross-ai-core


Multi-provider AI dispatcher with MD5-keyed response caching and unified error handling.

Supports Anthropic, xAI (Grok), OpenAI, Google Gemini, and Perplexity through a single consistent interface.

Requirements

  • Python 3.10 or newer (3.11 recommended for development)
  • No upper version limit — tested on 3.10–3.13

Install

Install only the provider(s) you need:

pip install "cross-ai-core[anthropic]"   # Claude
pip install "cross-ai-core[gemini]"      # Google Gemini
pip install "cross-ai-core[openai]"      # OpenAI (ChatGPT)
pip install "cross-ai-core[xai]"         # xAI Grok  (uses the OpenAI SDK)
pip install cross-ai-core                # Perplexity only (uses requests, no extra SDK)

Install all providers at once (used by cross-st, which runs all 5 simultaneously):

pip install "cross-ai-core[all]"

Dependencies

requests is always installed — it is used for the Perplexity provider and general HTTP.
The three provider SDKs are optional extras; pip installs only what you request.

Extra        Package         Version   Providers covered
(base)       requests        ≥2.32.4   Perplexity
[anthropic]  anthropic       ≥0.84.0   Anthropic / Claude
[gemini]     google-genai    ≥1.65.0   Google Gemini
[openai]     openai          ≥1.70.0   OpenAI
[xai]        openai          ≥1.70.0   xAI / Grok (OpenAI-compatible API)
[all]        all three SDKs above      All 5 providers

Quick start

import os
from dotenv import load_dotenv
load_dotenv(os.path.expanduser("~/.crossenv"))  # your app loads keys; the library reads os.environ

from cross_ai_core import process_prompt, get_content, get_default_ai

provider = get_default_ai()         # reads DEFAULT_AI from env, falls back to "xai"
result   = process_prompt(
    provider,
    "Explain transformer attention in 3 sentences.",
    system="You are a concise technical writer.",   # omit to use each provider's default
    verbose=False,
    use_cache=True,
)
print(get_content(provider, result.response))

Configuration (environment variables)

Variable             Default              Purpose
DEFAULT_AI           xai                  Default provider when none is specified
XAI_API_KEY          (none)               xAI / Grok API key
ANTHROPIC_API_KEY    (none)               Anthropic / Claude API key
OPENAI_API_KEY       (none)               OpenAI API key
GEMINI_API_KEY       (none)               Google Gemini API key
PERPLEXITY_API_KEY   (none)               Perplexity API key
CROSS_API_CACHE_DIR  ~/.cross_api_cache/  Response cache directory
CROSS_NO_CACHE       (unset)              Set to 1 to disable caching globally

The library only reads from os.environ — it never calls load_dotenv() itself.
Load your .env or ~/.crossenv before importing.
You only need to set API keys for the providers you actually use.
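The documented fallback behavior can be sketched with the stdlib alone. The helper names below are illustrative, not the library's actual source; only the variable names and the documented semantics (DEFAULT_AI falls back to "xai", CROSS_NO_CACHE=1 disables caching) come from the table above:

```python
import os

def get_default_ai() -> str:
    # Documented behavior: read DEFAULT_AI, fall back to "xai" when unset.
    return os.environ.get("DEFAULT_AI", "xai")

def caching_enabled() -> bool:
    # Documented behavior: CROSS_NO_CACHE=1 disables caching globally.
    return os.environ.get("CROSS_NO_CACHE") != "1"

os.environ.pop("DEFAULT_AI", None)
print(get_default_ai())               # "xai" when DEFAULT_AI is unset
os.environ["CROSS_NO_CACHE"] = "1"
print(caching_enabled())              # False: caching disabled globally
```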

Caching

Responses are cached by MD5 hash of the request payload in ~/.cross_api_cache/.
The cache is safe to delete at any time.

# Bypass cache for one call
result = process_prompt(provider, prompt, verbose=False, use_cache=False)

# Check if a response was served from cache
if result.was_cached:
    print("from cache")

Development

cd ~/github/cross-ai-core
python3.11 -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"     # installs the package + pytest + pytest-mock

Run the test suite:

python -m pytest tests/ -v

Tests use mocks — no real API keys required.
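A minimal version of that pattern uses unittest.mock to stand in for a provider client, so no network or keys are involved. All names here are illustrative, not taken from the library's actual test suite:

```python
from unittest.mock import MagicMock

# Stand-in for a provider SDK client; in real tests, pytest-mock would
# patch the SDK object instead of constructing one by hand.
fake_client = MagicMock()
fake_client.complete.return_value = {"content": "mocked answer"}

def ask(client, prompt: str) -> str:
    # Hypothetical helper that a test would exercise without real API keys.
    return client.complete(prompt)["content"]

assert ask(fake_client, "ping") == "mocked answer"
fake_client.complete.assert_called_once_with("ping")
```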

Note: Keep each repo's .venv separate; do not share it with dependent projects.

Adding a provider

  1. Create cross_ai_core/ai_<name>.py implementing BaseAIHandler (get_payload, get_client, get_cached_response, get_model, get_make, get_content, put_content, get_data_content, get_title, get_usage).
  2. Register in cross_ai_core/ai_handler.py: add to AI_HANDLER_REGISTRY and AI_LIST.

Documentation

  • API reference — all public functions, AIResponse, parallel calls, error handling
  • Providers — per-provider guide: models, API keys, strengths, free tiers
  • Changelog

Used by

Project   PyPI      Description
cross-st  cross-st  Multi-AI research reports with cross-product fact-checking. Installs this package automatically via cross-ai-core[all]. Full CLI toolkit — pipx install cross-st.

Building something with cross-ai-core? Open a PR or issue to get listed here.

License

MIT — free for personal, academic, and open-source use.
See COMMERCIAL_LICENSE.md for organizational and commercial use.

