
dazllm 🚀

Simple, unified interface for all major LLMs

Stop juggling different APIs and libraries. dazllm gives you a clean, consistent way to chat with any LLM - from GPT-4 and Claude to local Ollama or LM Studio models.

Features

Unified API - Same interface for OpenAI, Anthropic, Google, and local models (Ollama, LM Studio)
🔧 Smart Model Selection - Choose by name, type, or let it auto-select
🔐 Secure Configuration - API keys stored safely in system keyring
📝 Structured Output - Get Pydantic models directly from LLM responses
🎨 Image Generation - Create images with DALL-E and more
💻 CLI & Python API - Use from command line or import in your code

Quick Start

Installation

pip install dazllm

Setup

Configure your API keys using keyring:

keyring set dazllm openai_api_key YOUR_OPENAI_KEY
keyring set dazllm anthropic_api_key YOUR_ANTHROPIC_KEY
keyring set dazllm google_api_key YOUR_GOOGLE_KEY
keyring set dazllm ollama_url http://localhost:11434
keyring set dazllm lmstudio_url http://localhost:1234

Check everything is working:

dazllm --check

Usage

Command Line

# Simple chat
dazllm chat "What's the capital of France?"

# Use specific model  
dazllm chat --model openai:gpt-4 "Explain quantum computing"

# Use model type (auto-selects best available)
dazllm chat --model-type paid_best "Write a poem"

# Use provider default
dazllm chat --model openai "Tell me about AI"

# Structured output
dazllm structured "List 3 colors" --schema '{"type":"array","items":{"type":"string"}}'

# Generate images
dazllm image "a red cat wearing a hat" cat.png

# From file
dazllm chat --file prompt.txt --output response.txt

Python API

from dazllm import Llm, ModelType
from pydantic import BaseModel

# Instance-based usage
llm = Llm("openai:gpt-4")
response = llm.chat("Hello!")

# Static/module-level usage
response = Llm.chat("Hello!", model="anthropic:claude-3-5-sonnet-20241022")
response = Llm.chat("Hello!", model_type=ModelType.PAID_BEST)

# Structured output with Pydantic
class ColorList(BaseModel):
    colors: list[str]

result = Llm.chat_structured("List 3 colors", ColorList)
print(result.colors)  # ['red', 'green', 'blue']

# Image generation
Llm.image("a sunset over mountains", "sunset.png")

# Conversation history
conversation = [
    {"role": "user", "content": "Hello"},
    {"role": "assistant", "content": "Hi there!"},
    {"role": "user", "content": "What's your name?"}
]
response = Llm.chat(conversation, model="ollama:mistral-small")

Model Types

Instead of remembering model names, use semantic types:

  • local_small - ~1B parameter models (fast, basic)
  • local_medium - ~7B parameter models (good balance)
  • local_large - ~14B parameter models (best local quality)
  • paid_cheap - Cost-effective cloud models
  • paid_best - Highest quality cloud models
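To illustrate how a semantic type can resolve to a concrete model, here is a hedged sketch (not dazllm's actual selection logic): each type holds an ordered preference list, and the first model whose provider is configured wins. The preference ordering and the `pick_model` helper are assumptions for illustration; the model names come from this page's Model Format section.

```python
# Hypothetical sketch of semantic model selection; dazllm's real
# resolution logic may differ. Each type maps to an ordered preference
# list, and the first entry whose provider is configured is chosen.
PREFERENCES = {
    "paid_best": ["openai:gpt-4o", "anthropic:claude-3-5-sonnet-20241022"],
    "paid_cheap": ["openai:gpt-4o-mini", "anthropic:claude-3-haiku-20240307"],
    "local_medium": ["ollama:mistral-small", "lm-studio:mistral"],
}

def pick_model(model_type, configured_providers):
    """Return the first preferred model whose provider is configured."""
    for name in PREFERENCES.get(model_type, []):
        provider = name.split(":", 1)[0]
        if provider in configured_providers:
            return name
    raise ValueError(f"no configured model for type {model_type!r}")

# With only an Anthropic key configured, paid_best falls back past OpenAI:
print(pick_model("paid_best", {"anthropic", "ollama"}))
```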

Model Format

All models use the format provider:model:

  • OpenAI: openai:gpt-4o, openai:gpt-4o-mini, openai:dall-e-3
  • Anthropic: anthropic:claude-3-5-sonnet-20241022, anthropic:claude-3-haiku-20240307
  • Google: google:gemini-pro, google:gemini-flash
  • Ollama: ollama:mistral-small, ollama:llama3:8b, ollama:codellama:7b
  • LM Studio: lm-studio:mistral, lm-studio:llama3

You can also use just the provider name (e.g., openai) to use that provider's default model.
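One subtlety of the `provider:model` format is that Ollama model names can themselves contain a colon (e.g. `llama3:8b`), so a parser should split only on the first colon. A minimal sketch of that parsing rule (the `parse_model` helper is illustrative, not part of dazllm's API):

```python
# Sketch of parsing the provider:model format. Splitting on the first
# colon only keeps Ollama tags like "llama3:8b" intact, and a bare
# provider name means "use that provider's default model".
def parse_model(name):
    provider, sep, model = name.partition(":")
    return provider, (model if sep else None)

print(parse_model("openai:gpt-4o"))     # ('openai', 'gpt-4o')
print(parse_model("ollama:llama3:8b"))  # ('ollama', 'llama3:8b')
print(parse_model("openai"))            # ('openai', None)
```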

Configuration

API keys are stored securely in your system keyring:

# Set API keys
keyring set dazllm openai_api_key YOUR_OPENAI_KEY
keyring set dazllm anthropic_api_key YOUR_ANTHROPIC_KEY
keyring set dazllm google_api_key YOUR_GOOGLE_KEY
keyring set dazllm ollama_url http://localhost:11434
keyring set dazllm lmstudio_url http://localhost:1234

# Set default model (optional)
keyring set dazllm default_model openai:gpt-4o

# Check what's configured
dazllm --check

Examples

Building a Chatbot

from dazllm import Llm

def chatbot():
    llm = Llm.model_named("openai:gpt-4o")
    conversation = []
    
    while True:
        user_input = input("You: ")
        if user_input.lower() == 'quit':
            break
            
        conversation.append({"role": "user", "content": user_input})
        response = llm.chat(conversation)
        conversation.append({"role": "assistant", "content": response})
        
        print(f"AI: {response}")

chatbot()

Data Extraction

from dazllm import Llm
from pydantic import BaseModel

class Person(BaseModel):
    name: str
    age: int
    city: str

class People(BaseModel):
    people: list[Person]

text = "John Doe, age 30, lives in New York. Jane Smith, age 25, lives in LA."

result = Llm.chat_structured(
    f"Extract people info from: {text}",
    People,
    model="openai:gpt-4o-mini"
)

for person in result.people:
    print(f"{person.name} is {person.age} years old and lives in {person.city}")

Image Generation Pipeline

from dazllm import Llm

# Generate image description
description = Llm.chat(
    "Describe a serene mountain landscape in detail",
    model_type="paid_cheap"
)

# Generate the image
image_path = Llm.image(description, "mountain.png", width=1024, height=768)
print(f"Image saved to {image_path}")

Requirements

  • Python 3.8+
  • API keys for desired providers (OpenAI, Anthropic, Google)
  • Ollama or LM Studio installed for local models

License

MIT License

Contributing

Contributions welcome! Please see the GitHub repository for guidelines.


dazllm - Making LLMs accessible to everyone! 🚀

Download files


Source Distribution

dazllm-0.28.0.tar.gz (549.3 kB view details)

Uploaded Source

Built Distribution


dazllm-0.28.0-py3-none-any.whl (67.8 kB view details)

Uploaded Python 3

File details

Details for the file dazllm-0.28.0.tar.gz.

File metadata

  • Download URL: dazllm-0.28.0.tar.gz
  • Upload date:
  • Size: 549.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for dazllm-0.28.0.tar.gz:

  • SHA256: cdaa5aa969016491b3930afa266e85c084178b3310456aa32b498334ead36e7c
  • MD5: 389953b9747e8ebd23e7b0b42de0be73
  • BLAKE2b-256: af032ff21593aa6bae2754937ac69dc064ade897d7d2495de4a26e7ec579f40f


File details

Details for the file dazllm-0.28.0-py3-none-any.whl.

File metadata

  • Download URL: dazllm-0.28.0-py3-none-any.whl
  • Upload date:
  • Size: 67.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.5

File hashes

Hashes for dazllm-0.28.0-py3-none-any.whl:

  • SHA256: 271dcfbe26eefdf833b583a558bca4522d810b530c6f7982d116a07fe589d784
  • MD5: 90e54fee32746460ca05677b432676e8
  • BLAKE2b-256: 6ce3f0b9cfef9de44e40ada759d5b544c8e1a37b73df15446056009949db95aa

