This project has been archived.

The maintainers have marked this project as archived; no new releases are expected.

Project description

llm-factory

A flexible Python factory for working with multiple Large Language Model (LLM) providers (OpenAI, Anthropic, Gemini, Llama) using a unified interface, with robust configuration and extensibility.


Features

  • ✅ Unified interface for multiple LLM providers (OpenAI, Anthropic, Gemini, Llama)
  • ✅ Easy provider switching via configuration
  • ✅ Pydantic-based response validation
  • ✅ Environment variable-based secure configuration
  • ✅ Extensible for new providers
  • ✅ Per-provider settings for model, temperature, max tokens, and retries

Installation

pip install python-llm-factory

Configuration

The package uses environment variables for authentication and configuration. You can set these in a .env file or your environment:

# Required environment variables for each provider
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
GEMINI_API_KEY=your_gemini_api_key
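
Keys defined in a .env file are not loaded into the process environment by Python itself. A minimal sketch of loading them before constructing the factory, assuming the third-party python-dotenv package (not documented as a dependency of this project):

# Assumes python-dotenv is installed: pip install python-dotenv
from dotenv import load_dotenv

# Read key=value pairs from ./.env into os.environ so the
# provider clients can pick up their API keys.
load_dotenv()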

Examples

Basic Usage: Creating a Completion

from pydantic import BaseModel, Field
from python_llm_factory.llm_factory import LLMProvider, LLMFactory


# Pydantic model the completion is parsed and validated into.
class CompletionModel(BaseModel):
    response: str = Field(description="Your response to the user.")
    reasoning: str = Field(description="Explain your reasoning for the response.")


# Standard chat-style message list: a system prompt plus the user question.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {
        "role": "user",
        "content": "If it takes 2 hours to dry 1 shirt out in the sun, how long will it take to dry 5 shirts?",
    },
]

# Select a provider; GEMINI_API_KEY must be set in the environment.
llm = LLMFactory(provider=LLMProvider.GEMINI)
completion = llm.create_completion(
    response_model=CompletionModel,
    messages=messages,
)
print(f"Response: {completion.response}\n")
print(f"Reasoning: {completion.reasoning}")

🤝 Contributing

If you have a helpful tool, pattern, or improvement to suggest:

  • Fork the repo
  • Create a new branch
  • Submit a pull request

I welcome additions that promote clean, productive, and maintainable development.

🙏 Thanks

Thanks for exploring this repository!
Happy coding!

Download files

Download the file for your platform.

Source Distribution

python_llm_factory-0.0.2.tar.gz (5.0 kB)

Uploaded: Source

Built Distribution


python_llm_factory-0.0.2-py3-none-any.whl (5.6 kB)

Uploaded: Python 3

File details

Details for the file python_llm_factory-0.0.2.tar.gz.

File metadata

  • Download URL: python_llm_factory-0.0.2.tar.gz
  • Size: 5.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for python_llm_factory-0.0.2.tar.gz

  • SHA256: b4980eab0ec769b61781ff7ee71b5884373f010aed8bdf7794999126f1e0a13d
  • MD5: 1021243111f76ea6a656e9ba62fbca70
  • BLAKE2b-256: 027f5fc3f9d7e0000e67529407d6f9a8f24fa0cf2e9d82692ff3e49790c1e3dd

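To verify a downloaded archive against the published digest, a minimal sketch using the standard-library hashlib (the local file path is assumed):

import hashlib

# Compare the SHA256 digest of the downloaded sdist with the
# published value above; the file is assumed to sit in the CWD.
expected = "b4980eab0ec769b61781ff7ee71b5884373f010aed8bdf7794999126f1e0a13d"
with open("python_llm_factory-0.0.2.tar.gz", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
assert actual == expected, "hash mismatch: download may be corrupted"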

File details

Details for the file python_llm_factory-0.0.2-py3-none-any.whl.

File hashes

Hashes for python_llm_factory-0.0.2-py3-none-any.whl

  • SHA256: 9c6269d46fe408f3ce58f336e62e1e8b03ca91354906a1e15e6b90e717ff73bd
  • MD5: 862c43a517591ada1ad4a9579f4d1b01
  • BLAKE2b-256: a683c28c7dd8a6bdf7d4ea431991bb6aa5e2d15f19c99101858ba37094390853

