
# LLM-Wrapper

LLM-Wrapper is a Python package that provides a unified interface for interacting with multiple Large Language Models (LLMs) including ChatGPT, Claude, and Gemini.

## Features

- Easy initialization of LLM clients
- Unified interface for generating outputs from different LLMs
- Support for multiple models within each LLM platform
- API key validation during initialization

## Installation

```bash
pip install llm-wrapper
```

## Usage

### Initializing LLMs

You can initialize LLMs individually or all at once:

```python
from LLM import Initialize

# Initialize ChatGPT
Initialize.init_chatgpt("your_openai_api_key")

# Initialize Claude
Initialize.init_claude("your_anthropic_api_key")

# Initialize Gemini
Initialize.init_gemini("your_gemini_api_key")

# Initialize all LLMs at once
Initialize.init_all(
    chatgpt_api_key="your_openai_api_key",
    claude_api_key="your_anthropic_api_key",
    gemini_api_key="your_gemini_api_key"
)
```

Note: During initialization, a few tokens are used to verify that the provided API key is correct.

### Generating Output

```python
from LLM import Output

# Generate output using ChatGPT
gpt_response = Output.GPT("Tell me a joke about programming.")

# Generate output using Claude
claude_response = Output.Claude("Explain quantum computing in simple terms.")

# Generate output using Gemini
gemini_response = Output.Gemini("What are the benefits of renewable energy?")
```

### Customizing Model Parameters

You can customize model parameters when generating output:

```python
# Using a specific GPT model with custom temperature and max tokens
gpt_response = Output.GPT(
    "Summarize the history of artificial intelligence.",
    model="gpt-4o-mini-2024-07-18",
    temperature=0.7,
    max_tokens=2048
)

# Using a specific Claude model with custom temperature and max tokens
claude_response = Output.Claude(
    "Describe the process of photosynthesis.",
    model="claude-3-5-sonnet-20240620",
    temperature=0.5,
    max_tokens=1000
)
```

## Available Models

You can get information about available LLM models using the get_llm_info() function:

```python
from LLM.LLMModels import get_llm_info

llm_info = get_llm_info()
```

To get just the list of available models, you can use the LLM_MODELS dictionary:

```python
from LLM.LLMModels import LLM_MODELS

available_models = LLM_MODELS
```

Note: While the get_llm_info() function includes information about image upload support, this functionality is not currently implemented in the LLM-Wrapper.
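The exact layout of LLM_MODELS is not documented above. Assuming it maps each platform name to a list of model identifiers (the dictionary below is a hypothetical stand-in, not the package's real data), you could check whether a model name is available before passing it to an Output call:

```python
# Hypothetical stand-in for LLM.LLMModels.LLM_MODELS; the real dictionary
# ships with the package and may use different keys and entries.
LLM_MODELS = {
    "ChatGPT": ["gpt-4o-mini-2024-07-18"],
    "Claude": ["claude-3-5-sonnet-20240620"],
    "Gemini": ["gemini-1.5-flash"],
}

def is_supported(platform: str, model: str) -> bool:
    """Return True if `model` is listed under `platform` in LLM_MODELS."""
    return model in LLM_MODELS.get(platform, [])

print(is_supported("Claude", "claude-3-5-sonnet-20240620"))  # True with this stand-in
```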

## Error Handling

The package includes error handling for invalid API keys, unsupported models, and incorrect parameter values. Make sure to handle these exceptions in your code.
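As an illustration of that pattern (the specific exception classes raised by LLM-Wrapper are not documented here, so this sketch catches a generic `Exception` and uses a stub in place of a real API call), you might guard a generation call like this:

```python
from typing import Callable

def safe_generate(generate: Callable[[str], str], prompt: str, fallback: str = "") -> str:
    """Call an LLM generation function, returning `fallback` if it raises.

    `generate` can be any prompt-taking function, e.g. Output.GPT.
    """
    try:
        return generate(prompt)
    except Exception as err:  # narrow this to the package's specific exceptions
        print(f"Generation failed: {err}")
        return fallback

# Stub standing in for a real LLM call that fails (e.g. unsupported model):
def flaky(prompt: str) -> str:
    raise ValueError("unsupported model")

print(safe_generate(flaky, "Tell me a joke.", fallback="(no response)"))  # falls back
```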

## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

## License

This project is licensed under the MIT License.

