LLM provider API for Kollabor

kollabor-ai

kollabor-ai is the model/provider layer for Kollabor.

It owns profile loading, provider creation, prompt rendering, context services, conversation/session helpers, token/cost accounting, and response parsing. The CLI, engine, and agent runtime should use this package instead of talking directly to provider SDKs.

Current Role

  • Normalize provider access across Anthropic, OpenAI, OpenAI Responses, Azure OpenAI, Gemini, OpenRouter, and custom OpenAI-compatible endpoints.
  • Load, validate, and resolve LLM profiles, including environment-variable and OAuth-backed credentials.
  • Render system prompts and <trender> prompt fragments.
  • Parse streaming text, thinking/reasoning blocks, and tool-call deltas.
  • Track conversation logs, session names, branch names, pricing, and context service metadata.
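
Profile credentials referenced as `${VAR}` placeholders are resolved from the environment, as in the usage example below. The following is a minimal, hypothetical sketch of that resolution step; the real logic lives in profile_manager.py and may differ in detail.

```python
import os
import re

# Matches "${VAR_NAME}"-style placeholders (illustrative pattern, not the
# package's actual implementation).
_ENV_PATTERN = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")


def resolve_env_placeholders(value: str) -> str:
    """Replace ${VAR} placeholders in a profile field with environment values."""

    def _lookup(match: re.Match) -> str:
        name = match.group(1)
        resolved = os.environ.get(name)
        if resolved is None:
            raise KeyError(f"environment variable {name} is not set")
        return resolved

    return _ENV_PATTERN.sub(_lookup, value)
```

With `ANTHROPIC_API_KEY` set, `resolve_env_placeholders("${ANTHROPIC_API_KEY}")` returns the key's value; an unset variable raises instead of silently passing the placeholder through.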

Architecture

Module                                            Responsibility
api_communication_service.py                      high-level LLM request/streaming service
providers/                                        provider configs, adapters, registry, errors, transformers
profile_manager.py                                profile model, config/env resolution, persistence
profile_validator.py                              profile field validation and connection checks
prompt_renderer.py                                dynamic prompt rendering and <trender> support
system_prompt_builder.py                          assembled system prompt construction
response_parser.py / response_processor.py        response and tool-call parsing
streaming_thinking_parser.py                      streamed thinking/reasoning extraction
conversation_manager.py / conversation_logger.py  history and raw logs
context_service/                                  context ledger, file tracking, hash utilities, hub bridge
pricing_registry.py / cost_calculator.py          model pricing and usage costs
session_naming.py / session_parser.py             session metadata helpers

Usage

import asyncio

from kollabor_ai import APICommunicationService, LLMProfile


class DictConfig:
    """Minimal config object exposing the get() interface the service expects."""

    def __init__(self, values):
        self.values = values

    def get(self, key, default=None):
        return self.values.get(key, default)


async def main():
    profile = LLMProfile(
        name="default",
        provider="anthropic",
        model="claude-3-5-sonnet-20241022",
        api_key="${ANTHROPIC_API_KEY}",  # resolved from the environment
    )

    api = APICommunicationService(
        config=DictConfig({"kollabor.llm.enable_streaming": True}),
        raw_conversations_dir=".kollabor/raw",
        profile=profile,
    )

    await api.initialize()
    text = await api.call_llm([{"role": "user", "content": "hello"}])
    print(text)


asyncio.run(main())

Known Gaps

  • ProviderRegistry currently caches singleton instances by provider type. Until the registry is keyed by full provider configuration, or sessions create their own providers, callers that need strict per-profile isolation must take care not to share cached instances.
  • LLMProfile.to_dict() includes resolved API keys when present; API layers must explicitly redact profile dictionaries before returning them to clients.
  • Provider behavior is still partly normalized by convention. Tool-call, thinking, usage, and stop-reason contracts need broader cross-provider tests.
  • Prompt rendering can execute dynamic includes; callers must sanitize user-controlled prompts before rendering.
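
Until a redacted profile view helper lands (see the roadmap), API layers need to scrub profile dictionaries themselves. A minimal sketch, assuming plausible key names in the `LLMProfile.to_dict()` output (the actual field names may differ):

```python
# Keys assumed sensitive for illustration; match these to the real
# LLMProfile.to_dict() fields in your deployment.
SENSITIVE_KEYS = {"api_key", "oauth_token", "client_secret"}


def redact_profile(profile_dict: dict) -> dict:
    """Return a copy of a profile dict with credential values masked."""
    return {
        key: ("***" if key in SENSITIVE_KEYS and value else value)
        for key, value in profile_dict.items()
    }
```

Applying this before returning a profile to a client keeps resolved API keys out of responses while leaving non-sensitive fields intact.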

Roadmap

Phase 1: Provider isolation and safety

  • Key provider instances by full provider config or create session-scoped providers for clients that need isolation.
  • Add a redacted profile view helper for API/UI use.
  • Expand provider conformance tests for streaming tool calls, thinking content, token usage, and error classification.
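
One way to key provider instances by full configuration, sketched here with made-up names (this is not the package's ProviderRegistry, just an illustration of the roadmap item):

```python
import hashlib
import json


class ProviderRegistry:
    """Caches provider instances keyed by a hash of the full config dict."""

    def __init__(self, factory):
        self._factory = factory  # callable: config dict -> provider instance
        self._cache = {}

    def get(self, config: dict):
        # Canonical JSON so key order in the config dict does not matter.
        key = hashlib.sha256(
            json.dumps(config, sort_keys=True).encode()
        ).hexdigest()
        if key not in self._cache:
            self._cache[key] = self._factory(config)
        return self._cache[key]
```

Two calls with identical configs share one instance; any differing field (model, endpoint, credentials) yields a separate provider, which gives the per-profile isolation Known Gaps asks for.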

Phase 2: Contract cleanup

  • Make public service constructors and adapter boundaries easier to use outside the CLI orchestration layer.
  • Document the canonical message/tool-call schema expected by every provider.
  • Keep provider-specific transformers behind stable package APIs.
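
For a sense of what "canonical message/tool-call schema" means here, one plausible shape is shown below. This is an assumption for illustration only; the actual schema is what Phase 2 would document.

```python
# Hypothetical canonical assistant message with one tool call, before
# provider-specific transformers reshape it for a given SDK.
message = {
    "role": "assistant",
    "content": "Checking the weather.",
    "tool_calls": [
        {
            "id": "call_1",            # stable id for matching tool results
            "name": "get_weather",     # tool name as registered with the runtime
            "arguments": {"city": "Berlin"},  # parsed (not raw JSON string)
        }
    ],
}
```

Pinning one shape like this lets each provider transformer own the mapping to and from its SDK's wire format instead of normalizing by convention.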

Phase 3: Context and cost maturity

  • Document the context-service ledger and hub bridge as first-class APIs.
  • Add pricing registry refresh/versioning guidance.
  • Add stronger diagnostics for context-window and max-token decisions.
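
The pricing-registry/cost-calculator pairing reduces to per-token arithmetic. A minimal sketch with illustrative per-million-token prices (not the registry's real data or API):

```python
# Illustrative prices in USD per 1M tokens; a real registry would load and
# refresh these rather than hard-code them.
PRICING = {
    "claude-3-5-sonnet-20241022": {"input": 3.00, "output": 15.00},
}


def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Compute the USD cost of one request from token usage."""
    prices = PRICING[model]
    return (
        input_tokens * prices["input"] + output_tokens * prices["output"]
    ) / 1_000_000
```

Versioning the pricing table (the Phase 3 refresh/versioning item) matters because historical cost reports must be computed with the prices in effect at request time, not today's.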

Development

Targeted validation examples:

python -m py_compile packages/kollabor-ai/src/kollabor_ai/*.py
python -m pytest tests/unit/llm tests/unit/test_context_service_hub_bridge.py -q

Dependencies

  • pydantic >= 2.0
  • aiohttp >= 3.10
  • httpx >= 0.27
  • openai >= 1.0

License

MIT
