LLM Client for chat responses with support for different clients and models.
Modular, backend-agnostic interface for interacting with large language models (LLMs).
This library is part of the darca-* ecosystem and provides a plug-and-play, extensible interface to communicate with LLM providers like OpenAI. It is designed with testability, structure, and future integration in mind.
---
Features
✅ Unified AIClient interface for all LLMs
🔌 OpenAI integration out of the box (GPT-4, GPT-3.5)
🧱 Extensible abstract interface (BaseLLMClient) for new providers
🧪 Full pytest support with 100% coverage
📦 Rich exception handling with structured DarcaException
📋 Markdown-aware content formatting using _strip_markdown_prefix
🧠 Logging support via darca-log-facility
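New providers plug in by subclassing BaseLLMClient. Below is a minimal sketch, assuming the abstract interface exposes a get_raw_response(system, user) method (the real signature may differ); the classes here are local stand-ins for illustration, not imports from darca_llm:

```python
from abc import ABC, abstractmethod


# Hypothetical mirror of the BaseLLMClient abstract interface; the real
# class ships with darca_llm and may expose a different signature.
class BaseLLMClient(ABC):
    @abstractmethod
    def get_raw_response(self, system: str, user: str) -> str:
        """Return the raw text completion for a system/user prompt pair."""


# Example provider: a stub backend that echoes the user prompt,
# showing how a new provider would slot into the interface.
class EchoClient(BaseLLMClient):
    def get_raw_response(self, system: str, user: str) -> str:
        return f"[{system}] {user}"


client = EchoClient()
print(client.get_raw_response("You are terse.", "ping"))
```

A concrete OpenAI-backed client would follow the same shape, wrapping the provider SDK call inside get_raw_response.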
---
Quickstart
Install dependencies:

```shell
make install
```

Run all quality checks (format, test, docs):

```shell
make check
```

Run tests only:

```shell
make test
```

Use the client:

```python
from darca_llm import AIClient

ai = AIClient()
response = ai.get_raw_response(
    system="You are a helpful assistant.",
    user="What is a Python decorator?",
)
print(response)
```
---
Using get_file_content_response
The get_file_content_response() method supports structured file-content prompting: it ensures the LLM's reply contains exactly one code block and strips its markdown fencing via _strip_markdown_prefix().

Example:

```python
from darca_llm import AIClient

client = AIClient()
user_prompt = "Provide the content of a simple Python file."
result = client.get_file_content_response(
    system="Explain the code.",
    user=user_prompt,
)
print(result)
```
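The stripping step can be sketched as follows; this is a hypothetical re-implementation in the spirit of _strip_markdown_prefix() (the real helper is internal to darca-llm and may behave differently):

```python
# Hypothetical sketch of markdown-fence stripping; not the library's
# actual implementation of _strip_markdown_prefix().
def strip_markdown_prefix(text: str) -> str:
    """Remove a surrounding ```lang ... ``` fence from an LLM response."""
    lines = text.strip().splitlines()
    if lines and lines[0].startswith("```"):
        lines = lines[1:]          # drop the opening fence (and language tag)
        if lines and lines[-1].strip() == "```":
            lines = lines[:-1]     # drop the closing fence
    return "\n".join(lines)


raw = "```python\nprint('hello')\n```"
print(strip_markdown_prefix(raw))  # bare code, fences removed
```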
---
Error Handling
All exceptions are subclasses of DarcaException and include:

- LLMException: base class for all LLM-specific errors
- LLMAPIKeyMissing: raised when the API key for the selected backend is missing
- LLMContentFormatError: raised when:
  - multiple code blocks are detected within the response
  - the response cannot be properly stripped of markdown/code-block formatting
- LLMResponseError: raised when the LLM provider returns an error or response parsing fails

All exceptions include:

- error_code
- message
- optional metadata
- optional cause
- full stack-trace logging
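A call site can catch these errors from narrow to broad. Below is a minimal sketch of the documented hierarchy and attributes; the classes are local stand-ins mirroring the description above, not the real darca_llm implementations:

```python
# Local stand-ins mirroring the documented hierarchy; the real classes
# ship with darca_llm. Attribute names follow the list above
# (error_code, message, metadata, cause).
class DarcaException(Exception):
    def __init__(self, message, error_code=None, metadata=None, cause=None):
        super().__init__(message)
        self.message = message
        self.error_code = error_code
        self.metadata = metadata
        self.cause = cause


class LLMException(DarcaException):
    pass


class LLMAPIKeyMissing(LLMException):
    pass


# Hypothetical backend call that fails fast when no key is configured.
def call_backend(api_key):
    if not api_key:
        raise LLMAPIKeyMissing(
            "API key is not set for the selected backend",
            error_code="LLM_API_KEY_MISSING",
        )
    return "ok"


# Typical defensive call site: catch the narrow error first,
# fall back to the LLM base class for anything else.
try:
    call_backend(api_key=None)
except LLMAPIKeyMissing as exc:
    print(exc.error_code)  # prints "LLM_API_KEY_MISSING"
except LLMException:
    print("other LLM error")
```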
---
Documentation
Build and view the docs:

```shell
make docs
```

Open the HTML documentation at docs/build/html/index.html. For detailed usage, refer to the usage.rst documentation.
---
Project details
Download files
File details
Details for the file darca_llm-0.2.0.tar.gz.
File metadata
- Download URL: darca_llm-0.2.0.tar.gz
- Upload date:
- Size: 6.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | ac90ddd2eb62354c41fb02780d9a13fde0bd44556b2c01d8aae29ac22400d9c0 |
| MD5 | 3b6e446fdc2942c246557716b8e5f0ca |
| BLAKE2b-256 | 31b25875442c7b1355f7357cedb1b701037ff0156dea4abdb11d4a011a4354b4 |
Provenance

The following attestation bundles were made for darca_llm-0.2.0.tar.gz:

Publisher: cd.yml on roelkist/darca-llm

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: darca_llm-0.2.0.tar.gz
- Subject digest: ac90ddd2eb62354c41fb02780d9a13fde0bd44556b2c01d8aae29ac22400d9c0
- Sigstore transparency entry: 189980452
- Sigstore integration time:
- Permalink: roelkist/darca-llm@68c08f772eb0ea17f281df5887234e2a1a499d82
- Branch / Tag: refs/tags/v0.2.0
- Owner: https://github.com/roelkist
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: cd.yml@68c08f772eb0ea17f281df5887234e2a1a499d82
- Trigger Event: release
File details
Details for the file darca_llm-0.2.0-py3-none-any.whl.
File metadata
- Download URL: darca_llm-0.2.0-py3-none-any.whl
- Upload date:
- Size: 7.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 82b57a6ce1b2e061c6d4c048491192761659fa8aa302592fe63b21705546b3fc |
| MD5 | 7c2139717bf05b6e0ac4100f511309be |
| BLAKE2b-256 | 36849d628e1005e14488371cdc92566310dc652c63914e8278614e1337a896db |
Provenance

The following attestation bundles were made for darca_llm-0.2.0-py3-none-any.whl:

Publisher: cd.yml on roelkist/darca-llm

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: darca_llm-0.2.0-py3-none-any.whl
- Subject digest: 82b57a6ce1b2e061c6d4c048491192761659fa8aa302592fe63b21705546b3fc
- Sigstore transparency entry: 189980455
- Sigstore integration time:
- Permalink: roelkist/darca-llm@68c08f772eb0ea17f281df5887234e2a1a499d82
- Branch / Tag: refs/tags/v0.2.0
- Owner: https://github.com/roelkist
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: cd.yml@68c08f772eb0ea17f281df5887234e2a1a499d82
- Trigger Event: release