

Project description

Simple LLM API

A simple and easy-to-use Python wrapper for popular LLM APIs (OpenAI, Anthropic, and more).

Installation

uv is recommended for managing and installing packages in isolated environments.

uv add simple-llm-api

You can also install it using pip:

pip install simple-llm-api

Features

  • 🎯 Simple and consistent interface for multiple LLM providers
  • 🤖 Support for OpenAI, Anthropic, Google Gemini, Mistral, DeepSeek, and local LLMs
  • 🏠 Local LLM support (run models hosted on your own machine)
  • 🚀 Easy to use with minimal configuration
  • ⚙️ Customizable parameters for each provider
  • 🔧 **kwargs support for additional API parameters

Quick Start

OpenAI

from simple_llm_api import OpenAIAPI

openai = OpenAIAPI("YOUR_API_KEY")
response = openai.simple_request("Hi!")
print(response)

Anthropic

from simple_llm_api import AnthropicAPI

anthropic = AnthropicAPI("YOUR_API_KEY")
response = anthropic.simple_request("Hi!")
print(response)

Google Gemini

from simple_llm_api import GeminiAPI

gemini = GeminiAPI("YOUR_API_KEY")
response = gemini.simple_request("Hi!")
print(response)

Mistral

from simple_llm_api import MistralAPI

mistral = MistralAPI("YOUR_API_KEY")
response = mistral.simple_request("Hi!")
print(response)

DeepSeek

from simple_llm_api import DeepSeekAPI

deepseek = DeepSeekAPI("YOUR_API_KEY")
response = deepseek.simple_request("Hi!")
print(response)

Local LLMs

Use locally hosted models that expose an OpenAI-compatible API, such as LM Studio or Ollama.

from simple_llm_api import OpenAIAPI

openai = OpenAIAPI(model="MODEL_NAME")
openai._openai_endpoint = "http://localhost:8080/v1/chat/completions"
response = openai.simple_request("Hi!")
print(response)
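
For reference, LM Studio typically serves its OpenAI-compatible API at http://localhost:1234/v1/chat/completions and Ollama at http://localhost:11434/v1/chat/completions; adjust the endpoint to match whatever your local server exposes.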

Advanced Usage

Each API wrapper supports various parameters for customizing the response, plus **kwargs for additional API-specific parameters:

OpenAI

openai.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    top_p=1,
    max_completion_tokens=2048
)

Anthropic

anthropic.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    max_tokens=2048
)

Gemini

gemini.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    top_k=40,
    top_p=0.95,
    max_output_tokens=2048
)

Mistral

mistral.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=0.7,
    top_p=1,
    max_tokens=2048
)

DeepSeek

deepseek.simple_request(
    user_prompt="Your prompt here",
    system_prompt="Custom system prompt",
    temperature=1,
    top_p=1,
    max_tokens=2048
)
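
Because extra keyword arguments are forwarded to the underlying request, you can also pass provider parameters that are not listed above. A minimal sketch, assuming the OpenAI chat completions endpoint accepts these fields:

openai.simple_request(
    user_prompt="Your prompt here",
    seed=42,              # forwarded via **kwargs; OpenAI's reproducibility parameter
    presence_penalty=0.5  # likewise forwarded as an extra request field
)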

Error Handling

The library includes a custom exception for each provider:

  • OpenAIError: raised for errors from the OpenAI API
  • AnthropicError: raised for errors from the Anthropic API
  • GeminiError: raised for errors from the Google Gemini API
  • MistralError: raised for errors from the Mistral API
  • DeepSeekError: raised for errors from the DeepSeek API
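
A minimal sketch of handling these exceptions, assuming they are importable from the package root like the wrapper classes:

from simple_llm_api import OpenAIAPI, OpenAIError

openai = OpenAIAPI("YOUR_API_KEY")
try:
    response = openai.simple_request("Hi!")
    print(response)
except OpenAIError as e:
    # Handle request failures (network issues, bad keys, provider errors)
    print(f"OpenAI request failed: {e}")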

Disclaimer

This software is provided "as is" without warranty of any kind. The authors are not responsible for any problems arising from its use.

This library connects to third-party LLM APIs (OpenAI, Anthropic, Google Gemini, Mistral, and DeepSeek). You are responsible for complying with each provider's terms of service and for any costs you incur.

You are responsible for how you use this software and what you do with it.

By using this software, you accept these terms.

License

This project is licensed under the MIT License.


Download files

Download the file for your platform.

Source Distribution

simple_llm_api-1.3.1.tar.gz (6.8 kB)

Uploaded: Source

Built Distribution

simple_llm_api-1.3.1-py3-none-any.whl (6.2 kB)

Uploaded: Python 3

File details

Details for the file simple_llm_api-1.3.1.tar.gz.

File metadata

  • Download URL: simple_llm_api-1.3.1.tar.gz
  • Upload date:
  • Size: 6.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.22

File hashes

Hashes for simple_llm_api-1.3.1.tar.gz

  • SHA256: 7c0d253a70bfc6f2ea8b207026098e50f8d2fe56a5c5ddb0443f3fdaa9a2cdfe
  • MD5: 6802a7d60ed354acab3ccd4e1d4c364c
  • BLAKE2b-256: b5d0bcbfaed99d413eff1824a877bb56526c885d7539de1f4d459d79da4e375e


File details

Details for the file simple_llm_api-1.3.1-py3-none-any.whl.

File metadata

File hashes

Hashes for simple_llm_api-1.3.1-py3-none-any.whl

  • SHA256: f69a1f73821dfdc143e9b3a955f2428f1b9a57aeac13352602050db5ac1ea6d2
  • MD5: c99af836aa5bb8a5e39c335f96a18aca
  • BLAKE2b-256: e61f119648e94ee6aea6945673dadacede9e588277292297898b6490f674fb84

