GPUniq
Python client for the GPUniq GPU Meta-Cloud platform
GPUniq is a Python client for the GPUniq GPU Meta-Cloud platform. GPUniq is not just an LLM aggregator: it is a full-fledged meta-cloud system for running GPU computations, with automatic provider selection, failover mechanisms, and a snapshot system that ensures reliability and minimizes costs.
📌 Official website: gpuniq.com
🚀 Features
- 🤖 Multiple LLM models: access to OpenAI, GLM, Qwen, DeepSeek, and other models through a unified API.
- 💬 Simple interface: send requests in just a couple of lines of code.
- ⚡ Automatic failover: if a failure occurs, the task is instantly transferred to another machine.
- 💾 Snapshot system: progress is saved automatically, so no computation is lost.
- 🏗️ Multi-tier architecture: own GPUs → client GPUs → third-party providers.
- 💰 Save up to 70%: automatic selection of the most cost-effective GPUs in real time.
- 🔐 Security: authentication via API keys.
📚 Installation
Install the library via PyPI:
pip install GPUniq
🛠️ Getting Started
1️⃣ Initialize the client
Connect GPUniq to your project:
import gpuniq
# Initialize client with API key
client = gpuniq.init("gpuniq_your_api_key_here")
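For real projects it is safer to keep the key out of source code. Below is a minimal sketch that reads it from an environment variable; the variable name `GPUNIQ_API_KEY` and the `load_api_key` helper are conventions of this example, not part of the library:

```python
import os

def load_api_key(env_var="GPUNIQ_API_KEY"):
    """Read the GPUniq API key from the environment.

    The variable name is this example's convention, not part of GPUniq.
    """
    key = os.environ.get(env_var)
    if not key or not key.startswith("gpuniq_"):
        raise RuntimeError(f"Set {env_var} to a valid GPUniq API key")
    return key

# client = gpuniq.init(load_api_key())
```

The `startswith` check catches a common mistake early, since GPUniq keys are documented to start with `gpuniq_`.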
2️⃣ Simple LLM request
Send a message to a language model:
response = client.request(
    "openai/gpt-oss-120b",
    "Hello, how are you?",
)
print(response)
3️⃣ Error handling
Handle API errors:
from gpuniq import GPUniqError

try:
    response = client.request("invalid-model", "Hello!")
    print(response)
except GPUniqError as e:
    print(f"Error: {e.message}")
    print(f"Error code: {e.error_code}")
    print(f"HTTP status: {e.http_status}")
When a non-existent model is requested, the error message automatically lists the available models:
Error: Invalid model
Available models:
- zai-org/GLM-4.6
- openai/gpt-oss-120b
- Qwen/Qwen3-Coder-480B-A35B-Instruct
...
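The error attributes above make it easy to build a fallback chain: if one model rejects the request, try the next one. Below is a minimal sketch; `request_with_fallback` and its `request_fn` parameter are our own helpers, not part of the library (with the real client, pass `client.request` and `errors=(GPUniqError,)`):

```python
def request_with_fallback(request_fn, models, message, errors=(Exception,)):
    """Try models in order; return the first successful response.

    request_fn stands in for client.request, and errors should be
    (GPUniqError,) when used with the real client.
    """
    last_error = None
    for model in models:
        try:
            return request_fn(model, message)
        except errors as e:
            last_error = e
    raise last_error

# With the real client:
# answer = request_with_fallback(
#     client.request,
#     ["openai/gpt-oss-120b", "zai-org/GLM-4.6"],
#     "Hello!",
#     errors=(GPUniqError,),
# )
```

Taking the request function as a parameter keeps the helper decoupled from the client, which also makes it trivial to test with a stub.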
🛠️ API Methods
| Method | Description |
|---|---|
| `init(api_key)` | Initializes the client with an API key |
| `request(model, message)` | Sends a request to the LLM |
Detailed method descriptions
gpuniq.init(api_key: str) -> GPUniqClient
Initializes and returns a GPUniq client.
Parameters:
- api_key (str): Your GPUniq API key (starts with 'gpuniq_')

Returns:
- GPUniqClient: Client instance
GPUniqClient.request(model: str, message: str, role: str = "user", timeout: int = 30) -> str
Sends a simple request to a language model.
Parameters:
- model (str): Model identifier (e.g., 'openai/gpt-oss-120b')
- message (str): Message text
- role (str, optional): Message role (default: 'user')
- timeout (int, optional): Request timeout in seconds (default: 30)

Returns:
- str: Response from the language model
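Since `request` takes a `timeout` and network calls can fail transiently, a retry wrapper with exponential backoff is often useful. This is a sketch with our own helper name, not a library feature; with the real client, pass `client.request` and catch `GPUniqError` (for example, only when `http_status` indicates a 5xx error):

```python
import time

def request_with_retry(request_fn, model, message, attempts=3,
                       base_delay=0.5, errors=(Exception,)):
    """Call request_fn(model, message), retrying transient failures.

    Waits base_delay, 2*base_delay, 4*base_delay, ... between attempts,
    and re-raises the last error once attempts are exhausted.
    """
    for attempt in range(attempts):
        try:
            return request_fn(model, message)
        except errors:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

Retrying blindly on every `GPUniqError` would also retry permanent failures such as an invalid model name, so filtering on the error attributes first is the safer design.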
🎯 Available Models
- zai-org/GLM-4.6
- openai/gpt-oss-120b
- Qwen/Qwen3-Coder-480B-A35B-Instruct
- Qwen/Qwen3-235B-A22B-Instruct-2507
- Qwen/Qwen3-Next-80B-A3B-Instruct
- Qwen/QwQ-32B
- Qwen/Qwen2.5-Coder-32B-Instruct
- deepseek-ai/DeepSeek-R1-Distill-Llama-70B
- meta-llama/Llama-3.3-70B-Instruct
- t-tech/T-lite-it-1.0
- t-tech/T-pro-it-1.0
- t-tech/T-pro-it-2.0
📝 License
This project is distributed under the MIT license.
📌 Official website: gpuniq.com
📌 PyPI: GPUniq on PyPI
📌 GitHub: GPUniq on GitHub