
LLM provider API for Kollabor


kollabor-ai

kollabor-ai is the model/provider layer for Kollabor.

It owns profile loading, provider creation, prompt rendering, context services, conversation/session helpers, token/cost accounting, and response parsing. The CLI, engine, and agent runtime should use this package instead of talking directly to provider SDKs.

Current Role

  • Normalize provider access across Anthropic, OpenAI, OpenAI Responses, Azure OpenAI, Gemini, OpenRouter, and custom OpenAI-compatible endpoints.
  • Load, validate, and resolve LLM profiles, including environment-variable and OAuth-backed credentials.
  • Render system prompts and <trender> prompt fragments.
  • Parse streaming text, thinking/reasoning blocks, and tool-call deltas.
  • Track conversation logs, session names, branch names, pricing, and context service metadata.
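Profile fields such as api_key accept ${VAR} placeholders that are resolved from the environment at load time. A minimal sketch of how that resolution can work (the resolve_env helper below is illustrative, not the package's actual API):

```python
import os
import re

_ENV_PATTERN = re.compile(r"\$\{([A-Z0-9_]+)\}")


def resolve_env(value: str) -> str:
    """Replace ${VAR} placeholders with environment values; raise if unset."""
    def _sub(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"environment variable {name} is not set")
        return os.environ[name]

    return _ENV_PATTERN.sub(_sub, value)


os.environ["ANTHROPIC_API_KEY"] = "sk-test"
print(resolve_env("${ANTHROPIC_API_KEY}"))  # prints "sk-test"
```

Failing loudly on an unset variable keeps a misconfigured profile from silently sending an empty credential to a provider.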

Architecture

  • api_communication_service.py: high-level LLM request/streaming service
  • providers/: provider configs, adapters, registry, errors, transformers
  • profile_manager.py: profile model, config/env resolution, persistence
  • profile_validator.py: profile field validation and connection checks
  • prompt_renderer.py: dynamic prompt rendering and <trender> support
  • system_prompt_builder.py: assembled system prompt construction
  • response_parser.py / response_processor.py: response and tool-call parsing
  • streaming_thinking_parser.py: streamed thinking/reasoning extraction
  • conversation_manager.py / conversation_logger.py: history and raw logs
  • context_service/: context ledger, file tracking, hash utilities, hub bridge
  • pricing_registry.py / cost_calculator.py: model pricing and usage costs
  • session_naming.py / session_parser.py: session metadata helpers

Usage

from kollabor_ai import APICommunicationService, LLMProfile


class DictConfig:
    def __init__(self, values):
        self.values = values

    def get(self, key, default=None):
        return self.values.get(key, default)


profile = LLMProfile(
    name="default",
    provider="anthropic",
    model="claude-3-5-sonnet-20241022",
    api_key="${ANTHROPIC_API_KEY}",  # resolved from the environment at load time
)

api = APICommunicationService(
    config=DictConfig({"kollabor.llm.enable_streaming": True}),
    raw_conversations_dir=".kollab/raw",
    profile=profile,
)

# Inside an async context:
await api.initialize()
text = await api.call_llm([{"role": "user", "content": "hello"}])

Known Gaps

  • ProviderRegistry currently caches singleton provider instances by provider type. Until the registry is keyed by the full provider configuration, callers that need strict per-profile isolation should create session-scoped providers rather than rely on the shared cache.
  • LLMProfile.to_dict() includes resolved API keys when present; API layers must explicitly redact profile dictionaries before returning them to clients.
  • Provider behavior is still partly normalized by convention. Tool-call, thinking, usage, and stop-reason contracts need broader cross-provider tests.
  • Prompt rendering can execute dynamic includes; callers must sanitize user-controlled prompts before rendering.
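Until the built-in redacted view from Phase 1 lands, API layers can mask credentials themselves before returning a profile dictionary. A minimal sketch, assuming plain-dict output like LLMProfile.to_dict() (the key names are assumptions):

```python
SENSITIVE_KEYS = {"api_key", "oauth_token"}  # extend for your deployment


def redact_profile(profile_dict: dict) -> dict:
    """Return a copy of the profile dict with credential fields masked."""
    return {
        key: "***" if key in SENSITIVE_KEYS and value else value
        for key, value in profile_dict.items()
    }


print(redact_profile({"name": "default", "api_key": "sk-live-123"}))
# prints: {'name': 'default', 'api_key': '***'}
```

Redacting a copy rather than mutating in place matters here, since the original dict may still be needed for provider calls.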

Roadmap

Phase 1: Provider isolation and safety

  • Key provider instances by full provider config or create session-scoped providers for clients that need isolation.
  • Add a redacted profile view helper for API/UI use.
  • Expand provider conformance tests for streaming tool calls, thinking content, token usage, and error classification.

Phase 2: Contract cleanup

  • Make public service constructors and adapter boundaries easier to use outside the CLI orchestration layer.
  • Document the canonical message/tool-call schema expected by every provider.
  • Keep provider-specific transformers behind stable package APIs.
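The canonical message/tool-call schema mentioned above is not yet documented; the shape below is purely an illustration of the kind of contract Phase 2 would pin down, not the package's actual types:

```python
from typing import TypedDict


class ToolCall(TypedDict):
    id: str
    name: str
    arguments: dict  # parsed JSON arguments, not a raw string


class Message(TypedDict, total=False):
    role: str  # e.g. "system" | "user" | "assistant" | "tool"
    content: str
    tool_calls: list[ToolCall]


msg: Message = {"role": "user", "content": "hello"}
print(msg["role"])  # prints "user"
```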

Phase 3: Context and cost maturity

  • Document the context-service ledger and hub bridge as first-class APIs.
  • Add pricing registry refresh/versioning guidance.
  • Add stronger diagnostics for context-window and max-token decisions.
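The max-token decision in the last bullet reduces to budget arithmetic over the context window; a sketch of the kind of check such diagnostics would formalize (limits are illustrative):

```python
def token_budget(context_window: int, prompt_tokens: int, requested_max: int) -> int:
    """Clamp the completion budget to what the context window leaves over."""
    remaining = context_window - prompt_tokens
    if remaining <= 0:
        raise ValueError(
            f"prompt ({prompt_tokens} tokens) exceeds context window ({context_window})"
        )
    return min(requested_max, remaining)


print(token_budget(context_window=200_000, prompt_tokens=150_000, requested_max=64_000))
# prints 50000
```

Surfacing the clamped value (and why it was clamped) is what turns this arithmetic into a useful diagnostic.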

Development

Targeted validation examples:

python -m py_compile packages/kollabor-ai/src/kollabor_ai/*.py
python -m pytest tests/unit/llm tests/unit/test_context_service_hub_bridge.py -q

Dependencies

  • pydantic >= 2.0
  • aiohttp >= 3.10
  • httpx >= 0.27
  • openai >= 1.0

License

MIT
