Async orchestration nucleus for the AccuralAI local LLM ecosystem.


accuralai-core

accuralai-core is the orchestration nucleus for the AccuralAI open-source ecosystem. It provides an async pipeline that coordinates canonicalization, caching, routing, backend invocation, validation, and post-processing for local LLM text generation workflows.

This package exposes both a Python API and CLI surface while remaining transport-agnostic. Concrete functionality (canonicalizers, caches, routers, backends, validators) is supplied by sibling packages or third-party plugins via entry-point discovery.

Quick start

Install with the canonicalizer plugin for a fully working local pipeline:

pip install accuralai-core accuralai-canonicalize

Generate completions via the CLI:

accuralai-core generate \
  --prompt " Summarize   this text.  " \
  --system-prompt "You are a precise assistant." \
  --tag demo --metadata topic=news --param temperature=0.3

The accuralai-canonicalize plugin normalizes the prompt, merges metadata defaults, and creates deterministic cache keys before the request is routed to the configured backend.
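The idea behind deterministic cache keys can be sketched as follows: normalize the prompt, serialize the request canonically, and hash it. This is an illustrative sketch, not the plugin's actual key scheme:

```python
# Sketch of deterministic cache-key derivation (illustrative only).
import hashlib
import json


def cache_key(prompt: str, params: dict) -> str:
    """Hash a whitespace-normalized prompt plus canonically ordered params."""
    canonical_prompt = " ".join(prompt.split())  # collapse runs of whitespace
    payload = json.dumps(
        {"prompt": canonical_prompt, "params": params},
        sort_keys=True,
        separators=(",", ":"),
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


# Equivalent prompts produce identical keys despite differing whitespace.
k1 = cache_key(" Summarize   this text.  ", {"temperature": 0.3})
k2 = cache_key("Summarize this text.", {"temperature": 0.3})
assert k1 == k2
```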

Add accuralai-cache to enable TTL-aware in-memory caching:

pip install accuralai-cache

With caching enabled, repeat invocations are served instantly until the configured TTL expires.
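A TTL-aware in-memory cache boils down to storing each value with an expiry deadline and evicting on read. A minimal sketch (not the `accuralai-cache` implementation):

```python
# Minimal TTL-aware in-memory cache (illustrative only).
import time


class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}

    def set(self, key, value):
        # Record the value together with its absolute expiry deadline.
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires = entry
        if time.monotonic() >= expires:
            # Lazily evict expired entries on access.
            del self._store[key]
            return default
        return value
```

Until the deadline passes, `get` returns the cached pipeline result without re-invoking the backend.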

See plan/accuralai-core-spec.md for the full architectural specification guiding implementation.

Interactive CLI

Launch the Codex-style REPL by invoking accuralai (or accuralai-cli) with no subcommand:

accuralai --config ~/.accuralai/core.toml

The shell remains active until /exit and supports slash commands for adjusting settings on the fly (/help, /backend, /model, /meta, /history, /save, /tool ..., and more). Plain text input triggers a pipeline run with the current session defaults, and multi-line prompts are accepted by entering """ on an empty line. Tools can be listed with /tool list, executed manually with /tool run ..., and exposed to the model for function calling with /tool enable <name>.
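The slash-command routing described above can be sketched as a small dispatcher: lines starting with "/" are looked up in a command table, and everything else goes to the generation pipeline. This is an illustrative sketch, not the REPL's actual implementation:

```python
# Illustrative slash-command dispatcher for a REPL loop.
def dispatch(line: str, handlers: dict) -> str:
    """Route '/name args' to a named handler; plain text goes to 'generate'."""
    if line.startswith("/"):
        name, _, args = line[1:].partition(" ")
        handler = handlers.get(name)
        if handler is None:
            return f"unknown command: /{name}"
        return handler(args)
    # Non-command input triggers a pipeline run with session defaults.
    return handlers["generate"](line)
```

A real loop would read lines until /exit, accumulate multi-line input after a `"""` sentinel, and hand plain text to the async pipeline.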
