Async orchestration nucleus for the AccuralAI local LLM ecosystem.

Project description

accuralai-core

accuralai-core is the orchestration nucleus for the AccuralAI open-source ecosystem. It provides an async pipeline that coordinates canonicalization, caching, routing, backend invocation, validation, and post-processing for local LLM text generation workflows.

This package exposes both a Python API and CLI surface while remaining transport-agnostic. Concrete functionality (canonicalizers, caches, routers, backends, validators) is supplied by sibling packages or third-party plugins via entry-point discovery.

Quick start

Install with the canonicalizer plugin for a fully working local pipeline:

pip install accuralai-core accuralai-canonicalize

Generate completions via the CLI:

accuralai-core generate \
  --prompt " Summarize   this text.  " \
  --system-prompt "You are a precise assistant." \
  --tag demo --metadata topic=news --param temperature=0.3

The accuralai-canonicalize plugin normalizes the prompt, merges metadata defaults, and creates deterministic cache keys before the request is routed to the configured backend.

Add accuralai-cache to enable TTL-aware in-memory caching:

pip install accuralai-cache

With caching enabled, repeat invocations that produce the same canonical cache key are served from memory until the configured TTL expires.
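The TTL-aware behavior can be sketched with a minimal in-memory cache. This is an illustrative toy, not accuralai-cache's actual data structure:

```python
import time


class TTLCache:
    """Dict-backed cache whose entries expire after a fixed TTL."""

    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store: dict = {}

    def set(self, key, value):
        # Record the value alongside its absolute expiry deadline,
        # using the monotonic clock so wall-clock jumps don't matter.
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            # Expired: evict lazily on read and report a miss.
            del self._store[key]
            return default
        return value
```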

See plan/accuralai-core-spec.md for the full architectural specification guiding implementation.

Interactive CLI

Launch the Codex-style REPL by invoking accuralai (or accuralai-cli) with no subcommand:

accuralai --config ~/.accuralai/core.toml

The shell remains active until /exit and supports slash commands for adjusting settings on the fly (/help, /backend, /model, /meta, /history, /save, /tool ..., and more). Plain-text input triggers a pipeline run using the current session defaults; multi-line prompts are accepted by entering """ on an empty line. Tools can be listed with /tool list, executed manually with /tool run ..., and made available to the model for function calling with /tool enable <name>.
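The REPL's input handling described above can be sketched as a small classifier. The tuple shapes and helper name are an illustrative design, not accuralai-core's actual internals:

```python
def parse_repl_input(line: str):
    """Classify one line of REPL input.

    Returns ("command", name, args) for slash-prefixed lines,
    ("multiline",) for the triple-quote sentinel that starts or ends a
    multi-line prompt, and ("prompt", text) for plain text that should
    trigger a pipeline run with the current session defaults.
    """
    stripped = line.strip()
    if stripped == '"""':
        return ("multiline",)
    if stripped.startswith("/"):
        name, _, rest = stripped[1:].partition(" ")
        return ("command", name, rest.split())
    return ("prompt", line)
```

A real shell would dispatch the "command" tuples to handlers for /backend, /model, /tool, and so on, and accumulate lines between the two multiline sentinels into a single prompt.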

Download files

Download the file for your platform.

Source Distribution

accuralai_core-0.1.0.tar.gz (42.3 kB)

Uploaded Source

Built Distribution

accuralai_core-0.1.0-py3-none-any.whl (66.1 kB)

Uploaded Python 3

File details

Details for the file accuralai_core-0.1.0.tar.gz.

File metadata

  • Download URL: accuralai_core-0.1.0.tar.gz
  • Upload date:
  • Size: 42.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.18

File hashes

Hashes for accuralai_core-0.1.0.tar.gz

  • SHA256: 6cfdd2ec7991ce971866394f56d9f56f78a06503cdedc40850045aa847ddb9cf
  • MD5: 9bc20bc9d02b236f844688172364a6ab
  • BLAKE2b-256: cb76e497513be5b8e775fa1b35bb90bef557418bab7a1ee4c63ce32790940fa7


File details

Details for the file accuralai_core-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: accuralai_core-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 66.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.18

File hashes

Hashes for accuralai_core-0.1.0-py3-none-any.whl

  • SHA256: 2f7c8d080f43f1b54500bcc08f611946293c969705f39a6138c6cb5216189aa4
  • MD5: a7548087abeb8d9fda8cf45655322880
  • BLAKE2b-256: 189b07323fa8688c5bda6c5d562fea5060d888205d587c44eff44524d9dc8e1e

