LLM client for chat responses, with support for multiple providers and models.

Project description

Modular, backend-agnostic interface for interacting with large language models (LLMs).

This library is part of the darca-* ecosystem and provides a plug-and-play, extensible interface to communicate with LLM providers like OpenAI. It is designed with testability, structure, and future integration in mind.


Features

  • ✅ Unified AIClient interface for all LLMs

  • 🔌 OpenAI integration out of the box (GPT-4, GPT-3.5)

  • 🧱 Extensible abstract interface (BaseLLMClient) for new providers

  • 🧪 Full pytest support with 100% coverage

  • 📦 Rich exception handling with structured DarcaException

  • 📋 Markdown-aware content formatting

  • 🧠 Logging support via darca-log-facility
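The extensible abstract interface mentioned above is the standard abstract-base-class pattern. A minimal, self-contained sketch of that pattern follows; the method name mirrors the quickstart example, but the actual BaseLLMClient signature in darca-llm may declare more than this:

```python
from abc import ABC, abstractmethod


class BaseLLMClient(ABC):
    """Abstract interface a provider backend implements.

    Hypothetical sketch: the real darca-llm base class may define
    additional methods (e.g. for markdown-aware formatting).
    """

    @abstractmethod
    def get_raw_response(self, system: str, user: str) -> str:
        """Return the model's reply for a system/user prompt pair."""


class EchoClient(BaseLLMClient):
    """Toy backend that echoes the user prompt; handy in tests."""

    def get_raw_response(self, system: str, user: str) -> str:
        return f"[echo] {user}"


client = EchoClient()
print(client.get_raw_response("You are terse.", "ping"))  # → [echo] ping
```

A new provider only has to subclass the base and implement the abstract method; instantiating the base class directly raises TypeError, which is what makes the interface enforceable.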

Quickstart

  1. Install dependencies:

make install

  2. Run all quality checks (format, test, docs):

make check

  3. Run tests only:

make test

  4. Use the client:

from darca_llm import AIClient

ai = AIClient()
response = ai.get_raw_response(
    system="You are a helpful assistant.",
    user="What is a Python decorator?"
)
print(response)

Documentation

Build and view the docs:

make docs

Open the HTML documentation at:

docs/build/html/index.html

Project details


Download files

Download the file for your platform.

Source Distribution

darca_llm-0.1.0.tar.gz (4.5 kB)

Uploaded Source

Built Distribution


darca_llm-0.1.0-py3-none-any.whl (5.5 kB)

Uploaded Python 3

File details

Details for the file darca_llm-0.1.0.tar.gz.

File metadata

  • Download URL: darca_llm-0.1.0.tar.gz
  • Upload date:
  • Size: 4.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for darca_llm-0.1.0.tar.gz
Algorithm Hash digest
SHA256 8b46b2c81a2e6b42c5768e543f474263b4687bebfd613fc3e928becdcf411ebb
MD5 fe0e9872eee9c8d75e9236ca622cd7d5
BLAKE2b-256 8bbe999f16760821c96d900d2aee0c476d6216fc231a4d0723d386d6f8f87a7c

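To verify a downloaded artifact against the SHA256 digest above, Python's standard hashlib is enough. A generic sketch (the local file path is an assumption):

```python
import hashlib


def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Stream a file through SHA-256 and return the hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Digest published for darca_llm-0.1.0.tar.gz (from the table above):
EXPECTED = "8b46b2c81a2e6b42c5768e543f474263b4687bebfd613fc3e928becdcf411ebb"
# assert sha256_of("darca_llm-0.1.0.tar.gz") == EXPECTED
```

Reading in chunks keeps memory flat regardless of archive size; a mismatch means the file was corrupted or tampered with in transit.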

Provenance

The following attestation bundles were made for darca_llm-0.1.0.tar.gz:

Publisher: cd.yml on roelkist/darca-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file darca_llm-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: darca_llm-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 5.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for darca_llm-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 0348748673fad75ca68de100a897d3f82dc6cd33cf23ed973048f83669958e21
MD5 a6dfb5cecbd54f0f1aff2553820f3877
BLAKE2b-256 7d6a11109db45cbb059fdf821f8e863062e6809254b0b6c02728007f108f9fa0


Provenance

The following attestation bundles were made for darca_llm-0.1.0-py3-none-any.whl:

Publisher: cd.yml on roelkist/darca-llm

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
