
LLM provider API for Kollabor


kollabor-ai

kollabor-ai is the model/provider layer for Kollabor.

It owns profile loading, provider creation, prompt rendering, context services, conversation/session helpers, token/cost accounting, and response parsing. The CLI, engine, and agent runtime should use this package instead of talking directly to provider SDKs.

Current Role

  • Normalize provider access across Anthropic, OpenAI, OpenAI Responses, Azure OpenAI, Gemini, OpenRouter, and custom OpenAI-compatible endpoints.
  • Load, validate, and resolve LLM profiles, including environment-variable and OAuth-backed credentials.
  • Render system prompts and <trender> prompt fragments.
  • Parse streaming text, thinking/reasoning blocks, and tool-call deltas.
  • Track conversation logs, session names, branch names, pricing, and context service metadata.
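The environment-variable credential resolution mentioned above can be sketched as follows. This is a hypothetical illustration of how "${VAR}"-style placeholders in a profile might expand; the package's actual resolution logic lives in profile_manager.py and may differ.

```python
import os
import re

# Matches "${VAR_NAME}" placeholders such as "${ANTHROPIC_API_KEY}".
_PLACEHOLDER = re.compile(r"\$\{([A-Z0-9_]+)\}")


def resolve_placeholders(value: str) -> str:
    """Replace each ${VAR} with os.environ[VAR], leaving unknown vars intact."""
    def _sub(match: re.Match) -> str:
        return os.environ.get(match.group(1), match.group(0))

    return _PLACEHOLDER.sub(_sub, value)
```

Leaving unresolved placeholders intact (rather than substituting an empty string) makes missing credentials visible at validation time instead of producing a silently blank API key.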

Architecture

  • api_communication_service.py: high-level LLM request/streaming service
  • providers/: provider configs, adapters, registry, errors, transformers
  • profile_manager.py: profile model, config/env resolution, persistence
  • profile_validator.py: profile field validation and connection checks
  • prompt_renderer.py: dynamic prompt rendering and <trender> support
  • system_prompt_builder.py: assembled system prompt construction
  • response_parser.py / response_processor.py: response and tool-call parsing
  • streaming_thinking_parser.py: streamed thinking/reasoning extraction
  • conversation_manager.py / conversation_logger.py: history and raw logs
  • context_service/: context ledger, file tracking, hash utilities, hub bridge
  • pricing_registry.py / cost_calculator.py: model pricing and usage costs
  • session_naming.py / session_parser.py: session metadata helpers

Usage

import asyncio

from kollabor_ai import APICommunicationService, LLMProfile


class DictConfig:
    """Minimal config object exposing the get(key, default) interface."""

    def __init__(self, values):
        self.values = values

    def get(self, key, default=None):
        return self.values.get(key, default)


profile = LLMProfile(
    name="default",
    provider="anthropic",
    model="claude-3-5-sonnet-20241022",
    api_key="${ANTHROPIC_API_KEY}",
)

api = APICommunicationService(
    config=DictConfig({"kollabor.llm.enable_streaming": True}),
    raw_conversations_dir=".kollab/raw",
    profile=profile,
)


async def main():
    await api.initialize()
    text = await api.call_llm([{"role": "user", "content": "hello"}])
    print(text)


asyncio.run(main())

Known Gaps

  • ProviderRegistry currently caches singleton instances by provider type. Until the registry is keyed by the full provider configuration, or sessions create their own providers, callers that need strict per-profile isolation must account for this shared state.
  • LLMProfile.to_dict() includes resolved API keys when present; API layers must explicitly redact profile dictionaries before returning them to clients.
  • Provider behavior is still partly normalized by convention. Tool-call, thinking, usage, and stop-reason contracts need broader cross-provider tests.
  • Prompt rendering can execute dynamic includes; callers must sanitize user-controlled prompts before rendering.
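For the LLMProfile.to_dict() gap above, a redaction pass along these lines is one option. The field names here are assumptions for illustration, not the package's actual schema.

```python
# Credential-bearing keys to mask; adjust to the real profile schema.
SENSITIVE_KEYS = {"api_key", "oauth_token", "client_secret"}


def redact_profile(profile_dict: dict) -> dict:
    """Return a copy of a profile dict with credential fields masked."""
    return {
        key: "***REDACTED***" if key in SENSITIVE_KEYS and value else value
        for key, value in profile_dict.items()
    }
```

Redacting a copy rather than mutating in place keeps the resolved profile usable for outbound requests while the API layer returns only the masked view.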

Roadmap

Phase 1: Provider isolation and safety

  • Key provider instances by full provider config or create session-scoped providers for clients that need isolation.
  • Add a redacted profile view helper for API/UI use.
  • Expand provider conformance tests for streaming tool calls, thinking content, token usage, and error classification.

Phase 2: Contract cleanup

  • Make public service constructors and adapter boundaries easier to use outside the CLI orchestration layer.
  • Document the canonical message/tool-call schema expected by every provider.
  • Keep provider-specific transformers behind stable package APIs.
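A documented message/tool-call schema could look like the sketch below. The shape mirrors the common OpenAI-compatible convention and is an assumption for illustration, not the package's settled contract.

```python
from typing import Any, TypedDict


class ToolCall(TypedDict):
    """One tool invocation requested by the model."""
    id: str
    name: str
    arguments: dict[str, Any]


class Message(TypedDict, total=False):
    """Canonical chat message; tool_calls is present only on assistant turns."""
    role: str                   # "system" | "user" | "assistant" | "tool"
    content: str
    tool_calls: list[ToolCall]


msg: Message = {
    "role": "assistant",
    "content": "",
    "tool_calls": [
        {"id": "call_1", "name": "read_file", "arguments": {"path": "README.md"}},
    ],
}
```

Pinning one canonical shape lets the provider-specific transformers stay private while every adapter converts to and from this single schema.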

Phase 3: Context and cost maturity

  • Document the context-service ledger and hub bridge as first-class APIs.
  • Add pricing registry refresh/versioning guidance.
  • Add stronger diagnostics for context-window and max-token decisions.
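The pricing/cost accounting the roadmap refers to reduces to a per-token lookup, sketched below in the spirit of pricing_registry.py / cost_calculator.py. The prices and field names here are illustrative assumptions, not the registry's real data.

```python
# USD per million tokens: (input_price, output_price). Example values only.
PRICING_PER_MTOK = {
    "claude-3-5-sonnet-20241022": (3.00, 15.00),
}


def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate request cost in USD from token counts and the pricing table."""
    in_price, out_price = PRICING_PER_MTOK[model]
    return (input_tokens * in_price + output_tokens * out_price) / 1_000_000
```

A versioned pricing table (one snapshot per registry refresh) would make historical cost reports reproducible after prices change.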

Development

Targeted validation examples:

python -m py_compile packages/kollabor-ai/src/kollabor_ai/*.py
python -m pytest tests/unit/llm tests/unit/test_context_service_hub_bridge.py -q

Dependencies

  • pydantic >= 2.0
  • aiohttp >= 3.10
  • httpx >= 0.27
  • openai >= 1.0

License

MIT
