LLMConnect
A unified Python library for querying multiple Large Language Models (LLMs) using direct REST API calls.
Features
- Multi-Provider Support: Connect to OpenAI, Claude, Gemini, Perplexity, Mistral, DeepSeek, and LLaMA
- Auto-Detection: Automatically detects the provider from the model-name prefix
- Simple Interface: Just provide a model name and an API key
- Customizable: Optional temperature and max_tokens parameters
- No SDKs Required: Uses direct REST API calls via the requests library
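The prefix-based auto-detection above can be sketched as a simple lookup table. This is a hypothetical illustration (the prefixes are inferred from the supported-model names listed below; the library's actual mapping is not documented here):

```python
# Hypothetical prefix -> provider table, inferred from the supported models.
PROVIDER_PREFIXES = {
    "gpt": "openai",
    "claude": "anthropic",
    "gemini": "google",
    "sonar": "perplexity",
    "mistral": "mistral",
    "deepseek": "deepseek",
    "llama": "llama",
}

def detect_provider(model_name: str) -> str:
    """Return the provider whose prefix matches the model name."""
    name = model_name.lower()
    for prefix, provider in PROVIDER_PREFIXES.items():
        if name.startswith(prefix):
            return provider
    raise ValueError(f"Cannot detect provider for model {model_name!r}")
```

For example, `detect_provider("claude-3-sonnet")` resolves to `anthropic`, so no explicit provider argument is needed.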
Installation
```bash
pip install llmconnect
```
Quick Start
```python
from llmconnect import LLMConnect

# Initialize with your preferred model
client = LLMConnect("gpt-4", "your-openai-api-key")

# Send a message
response = client.chat("Hello, how are you?")
print(response)
```
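Because the library uses direct REST calls rather than provider SDKs, each `chat()` call presumably boils down to a plain HTTP request. As an illustration, here is what a direct call to OpenAI's public chat-completions endpoint looks like (the endpoint and JSON fields come from OpenAI's public API; the helper names are hypothetical and not part of llmconnect):

```python
OPENAI_URL = "https://api.openai.com/v1/chat/completions"

def build_payload(model, message, temperature=0.7, max_tokens=1024):
    """Request body for OpenAI's chat-completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": message}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def extract_reply(response_json):
    """Pull the assistant's text out of a chat-completions response."""
    return response_json["choices"][0]["message"]["content"]

def chat_via_rest(model, api_key, message, **kwargs):
    # requests is imported lazily so the pure helpers above work without it
    import requests

    resp = requests.post(
        OPENAI_URL,
        headers={"Authorization": f"Bearer {api_key}"},
        json=build_payload(model, message, **kwargs),
        timeout=30,
    )
    resp.raise_for_status()
    return extract_reply(resp.json())
```

Other providers differ only in the URL, auth header, and response shape, which is presumably what the provider auto-detection abstracts away.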
Supported Models
- OpenAI: gpt-4, gpt-3.5-turbo, etc.
- Claude: claude-3-sonnet, claude-3-opus, etc.
- Gemini: gemini-pro, gemini-ultra, etc.
- Perplexity: sonar-pro, sonar-medium, etc.
- Mistral: mistral-large-latest, mistral-medium, etc.
- DeepSeek: deepseek-chat, deepseek-coder, etc.
- LLaMA: llama3-70b, llama2-13b, etc.
Advanced Usage
```python
# Custom parameters
client = LLMConnect(
    model_name="claude-3-sonnet",
    api_key="your-claude-api-key",
    temperature=0.9,
    max_tokens=2048,
)

response = client.chat("Write a creative story")
```
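Hosted LLM endpoints fail transiently (rate limits, timeouts), so it is worth wrapping any `client.chat(...)`-style call in a retry. This is a generic sketch, not part of llmconnect; the `FlakyClient` below is a stand-in used only to demonstrate the wrapper:

```python
import time

def chat_with_retry(client, message, retries=3, backoff=1.0):
    """Call client.chat(message), retrying with exponential backoff."""
    for attempt in range(retries):
        try:
            return client.chat(message)
        except Exception:
            if attempt == retries - 1:
                raise  # out of attempts: surface the original error
            time.sleep(backoff * (2 ** attempt))

class FlakyClient:
    """Stand-in client that fails once, then succeeds (demo only)."""
    def __init__(self):
        self.calls = 0

    def chat(self, message):
        self.calls += 1
        if self.calls == 1:
            raise ConnectionError("transient failure")
        return f"echo: {message}"
```

In real use you would pass an `LLMConnect` instance instead of `FlakyClient`; the wrapper only assumes the object has a `chat(message)` method.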
API Keys
You'll need API keys from the respective providers:
- OpenAI: https://platform.openai.com/
- Anthropic (Claude): https://console.anthropic.com/
- Google (Gemini): https://makersuite.google.com/
- Perplexity: https://www.perplexity.ai/
- Mistral: https://console.mistral.ai/
- DeepSeek: https://platform.deepseek.com/
- LLaMA: https://www.llama-api.com/
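Whichever providers you use, avoid hard-coding keys in source. A common pattern, nothing here is mandated by llmconnect and the environment-variable name is your choice, is to read keys from the environment and fail loudly when one is missing:

```python
import os

def require_key(env_var: str) -> str:
    """Fetch an API key from the environment, failing loudly if unset."""
    key = os.environ.get(env_var)
    if not key:
        raise RuntimeError(f"Set {env_var} to your provider's API key.")
    return key
```

For example, `LLMConnect("gpt-4", require_key("OPENAI_API_KEY"))` keeps the secret out of your repository.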
License
MIT License
Download files
Source Distribution
- llm_api_bridge-1.0.0.tar.gz (7.1 kB)
Built Distribution
- llm_api_bridge-1.0.0-py3-none-any.whl (12.4 kB)
File details
Details for the file llm_api_bridge-1.0.0.tar.gz.
File metadata
- Download URL: llm_api_bridge-1.0.0.tar.gz
- Upload date:
- Size: 7.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e465a02b2e3513264004a1e8f9dca59f7c4271deb40adfcb912632e048ed0809 |
| MD5 | 393b4ee8f3bdc1bc027a26d0a49290e2 |
| BLAKE2b-256 | 5a4e612289e4cd44d29a9ed8be604dc04973e77b5f6b187dde03e19478a2e175 |
File details
Details for the file llm_api_bridge-1.0.0-py3-none-any.whl.
File metadata
- Download URL: llm_api_bridge-1.0.0-py3-none-any.whl
- Upload date:
- Size: 12.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.9
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8111c88463b78472ca09dcdc774afb91d50e5410a12ff41be1c2e2fbc1551cf8 |
| MD5 | 74825f24284817a438f667f81101e98d |
| BLAKE2b-256 | c237b0d44e7f4a0e6fcfef1d7e8c16215f3371ffddc5437a817211bd2f1e0c1a |