
Language Model Development Kit

What it offers:

  • Simplest interface to call different Language Model APIs
  • Minimal dependencies: HTTP requests only, no third party packages
  • Streaming
  • Comfy structured outputs via Pydantic models (only when the provider / model supports them natively)
  • Parallel completions
  • Unified HTTP error handling
  • Easy location config (for providers with multiple datacenters like AWS Bedrock, GCP Vertex and Azure)
  • Model fallbacks
  • Bring Your Own Key (for each provider)

What it does NOT offer:

  • Tools / function calling / MCP
  • Agents
  • Multimodality (only text-in, text-out)
  • Shady under-the-hood prompt modification (e.g. to force structured output)
  • API gateways

If you are looking for a more constrained but out-of-the-box agent interface, I'd recommend pydantic-ai or haystack-ai. If you want to keep granular control but need tools or multimodality, I'd recommend litellm, or leveraging the OpenAI-compatible endpoints that providers normally set up. If you want a single token for all providers and are willing to share telemetry data, check out gateways like OpenRouter.

Installation

uv add lmdk  # or: pip install lmdk

Usage

from lmdk import complete
# message types used below; import path assumed to be the package root
from lmdk import UserMessage, AssistantMessage

model = "mistral:mistral-small-2603"
# supports locations as in "vertex:gemini-2.5-flash@europe-west4"

Single prompt

response = complete(model=model, prompt="Tell me a joke")

Multi-turn conversation

messages = [
    UserMessage("My name is Alice."),
    AssistantMessage("Nice to meet you, Alice!"),
    UserMessage("What is my name?"),
]
response = complete(model=model, prompt=messages)

System prompt and generation kwargs

response = complete(
    model=model,
    prompt="Hi!",
    system_instruction="Talk like a pirate",
    generation_kwargs={"temperature": 0.9, "max_tokens": 10}
)

Streaming

token_iter = complete(model=model, prompt="Count from 1 to 5.", stream=True)
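
Assuming the stream=True call returns an iterator of text chunks (not verified against lmdk's internals), consuming it looks like this self-contained sketch, with fake_stream standing in for the real call:

```python
from collections.abc import Iterator

def fake_stream() -> Iterator[str]:
    """Stand-in for complete(..., stream=True); yields text chunks."""
    yield from ["1", ", 2", ", 3", ", 4", ", 5"]

# Print chunks as they arrive, then keep the full text.
chunks = []
for token in fake_stream():
    print(token, end="", flush=True)
    chunks.append(token)
full_text = "".join(chunks)
print()
```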

Model fallbacks

response = complete(model=["mistral:nonexistent-model", model], prompt="Hi")
# the first attempt raises NotFoundError because the model does not exist; the second succeeds
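
The fallback behavior can be understood as roughly this loop. This is a hypothetical sketch of the semantics with stubbed names (NotFoundError, call_provider), not lmdk's actual implementation:

```python
class NotFoundError(Exception):
    """Stand-in for lmdk's unified HTTP error type."""

def call_provider(model: str, prompt: str) -> str:
    # Stub: pretend every model except one is missing upstream.
    if model == "mistral:nonexistent-model":
        raise NotFoundError(model)
    return f"response from {model}"

def complete_with_fallbacks(models: list[str], prompt: str) -> str:
    last_error: Exception | None = None
    for model in models:
        try:
            return call_provider(model, prompt)
        except NotFoundError as err:
            last_error = err  # remember the failure, try the next model
    if last_error is None:
        raise ValueError("no models given")
    raise last_error  # every candidate failed

print(complete_with_fallbacks(
    ["mistral:nonexistent-model", "mistral:mistral-small-2603"], "Hi"
))
```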

Structured output

from pydantic import BaseModel

class Ingredient(BaseModel):
    name: str
    quantity: int
    unit: str = ""

class Recipe(BaseModel):
    ingredients: list[Ingredient]

response = complete(model=model, prompt="How do I make cheesecake?", output_schema=Recipe)
# response.parsed will hold a Recipe instance

Parallel calls

from lmdk import complete_batch

results = complete_batch(model=model, prompt_list=["Greet in English", "Saluda en español."])
# results will be a list of CompletionResult
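
Since the library speaks plain HTTP, parallel completions are presumably I/O-bound; a thread pool captures the idea. This is a hypothetical sketch with a stubbed completion function, not lmdk's actual code:

```python
from concurrent.futures import ThreadPoolExecutor

def fake_complete(prompt: str) -> str:
    """Stand-in for a single blocking completion call."""
    return f"echo: {prompt}"

def complete_batch_sketch(prompts: list[str]) -> list[str]:
    # pool.map preserves input order, so results line up one per prompt
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(fake_complete, prompts))

print(complete_batch_sketch(["Greet in English", "Saluda en español."]))
```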

Template Rendering

from lmdk import render_template

# Render a template string with variables
result = render_template(
    template="Hello, {{ name }}!",
    name="World"
)
# Output: "Hello, World!"

# Render a template from a jinja file
result = render_template(
    path="path/to/template.jinja2",
    name="World"
)
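
Given the no-third-party-dependencies goal, "{{ name }}" substitution can be done with a small regex. This is a hypothetical sketch of the idea (render_sketch is an invented name, and lmdk's real render_template may behave differently, e.g. around missing variables):

```python
import re

def render_sketch(template: str, **variables: object) -> str:
    """Replace each {{ name }} placeholder with its keyword argument."""
    def substitute(match: re.Match) -> str:
        return str(variables[match.group(1)])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

print(render_sketch("Hello, {{ name }}!", name="World"))  # Hello, World!
```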

Development

Structure

src/lmdk/
├── core.py         # Entry points: complete, complete_batch
├── datatypes.py    # Common message and response schemas
├── provider.py     # Base Provider class and registry
├── providers/      # Concrete implementations (Mistral, Vertex, etc.)
├── errors.py       # Unified HTTP and API error handling
└── utils.py        # Shared helper functions
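
A base Provider class plus a registry, as provider.py suggests, is a common pattern. A minimal hypothetical sketch of how it could look (all names illustrative, not lmdk's actual API):

```python
PROVIDERS: dict[str, type["Provider"]] = {}

class Provider:
    """Base class; subclasses register themselves under a provider prefix."""
    def __init_subclass__(cls, *, prefix: str, **kwargs):
        super().__init_subclass__(**kwargs)
        PROVIDERS[prefix] = cls

    def complete(self, model: str, prompt: str) -> str:
        raise NotImplementedError

class MistralProvider(Provider, prefix="mistral"):
    def complete(self, model: str, prompt: str) -> str:
        return f"[mistral/{model}] {prompt}"

def get_provider(model_id: str) -> Provider:
    # Look up the concrete provider from the "provider:model" prefix.
    prefix = model_id.split(":", 1)[0]
    return PROVIDERS[prefix]()

print(get_provider("mistral:mistral-small-2603").complete("mistral-small-2603", "Hi"))
```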

Tooling

We use just for development tasks:

  • just sync: Updates lockfile and syncs environment.
  • just format: Lints and formats with ruff.
  • just check-types: Static analysis with ty.
  • just check-complexity: Cyclomatic complexity checks with complexipy.
  • just test: Runs pytest with 90% coverage threshold.

See justfile for a complete list of dev commands.

Contribute

  1. Hooks: Install pre-commit hooks via just install-hooks. PRs will fail CI if linting/formatting is not applied.
  2. Issues: Open an issue first using the default template.
  3. PRs: Link your PR to the relevant issue using the PR template.

You can use just validate <model> (runs example.py) to verify which features work for a new provider / model. Not all of them have to pass to open a PR: some providers do not even support native structured output. At minimum, the plain non-structured, non-streamed completion must work; the rest can raise NotImplementedError.

License

MIT

Made with the mold template
