# OpenWebUI Client
A client library for the OpenWebUI API, compatible with the OpenAI Python SDK but with extensions specific to OpenWebUI features.
## Installation

```bash
pip install openwebui-client
```
## Quick Start

```python
from openwebui_client import OpenWebUIClient

# Initialize the client
client = OpenWebUIClient(
    api_key="your_api_key",  # Optional if set in an environment variable
    base_url="http://your-openwebui-instance:5000",
    default_model="gpt-4",
)

# Basic chat completion
response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, world!"},
    ],
)
print(response.choices[0].message.content)
```
## Using Function Calling / Tools

The client supports OpenAI-compatible function calling with tools:

```python
# Direct tool usage with chat completions
response = client.chat.completions.create(
    model="your_model",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the current time?"},
    ],
    tools=[
        {
            "type": "function",
            "function": {
                "name": "get_current_time",
                "description": "Get the current time.",
                "parameters": {
                    "type": "object",
                    "properties": {},
                    "required": [],
                },
            },
        }
    ],
)

# Check whether the model used tools
if response.choices[0].message.tool_calls:
    tool_call = response.choices[0].message.tool_calls[0]
    print(f"Tool called: {tool_call.function.name}")
```
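Because the client mirrors the OpenAI SDK, a full round trip would execute the requested function locally and feed the result back as a `tool` message before asking the model for its final answer. The helper below is a sketch of that standard pattern; `run_tool_call` and `available_functions` are illustrative names, not part of this library:

```python
import json


def run_tool_call(tool_call, available_functions):
    """Dispatch one tool call to a local Python function and wrap the
    result as an OpenAI-style 'tool' message for the follow-up request."""
    fn = available_functions[tool_call.function.name]
    args = json.loads(tool_call.function.arguments or "{}")
    return {
        "role": "tool",
        "tool_call_id": tool_call.id,
        "content": str(fn(**args)),
    }
```

The returned message is appended to the conversation (after the assistant message containing the tool calls) and the conversation is sent through `client.chat.completions.create` again so the model can produce its final reply.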
## Using the Tool Registry

The client includes a tool registry for easier management of tools:

```python
# Define a tool function
def get_weather(location: str, unit: str = "celsius") -> str:
    """Get the current weather in a given location.

    Args:
        location: The location to get weather for
        unit: The temperature unit to use (celsius or fahrenheit)

    Returns:
        str: A string describing the current weather
    """
    return f"The weather in {location} is sunny and 25°{unit[0]}"

# Register the tool with the client
client.tool_registry.register(get_weather)

# Use chat_with_tools for automatic tool handling
response = client.chat_with_tools(
    messages=[{"role": "user", "content": "What's the weather like in Toronto?"}],
    max_tool_calls=5,
)
print(response)  # Contains the final response after any tool calls
```
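Conceptually, a registry like this can derive the OpenAI-style tool spec from a function's signature and docstring, which is why `get_weather` above needs no hand-written JSON schema. The sketch below shows the general idea with `inspect`; it is illustrative only, and the library's actual mechanism may differ:

```python
import inspect

# Map Python annotations to JSON-schema types (illustrative subset)
_JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}


def function_to_tool_spec(fn):
    """Build an OpenAI-style tool spec from a function's signature and docstring."""
    doc = (fn.__doc__ or "").strip()
    properties, required = {}, []
    for name, param in inspect.signature(fn).parameters.items():
        properties[name] = {"type": _JSON_TYPES.get(param.annotation, "string")}
        if param.default is inspect.Parameter.empty:
            # Parameters without defaults are marked as required
            required.append(name)
    return {
        "type": "function",
        "function": {
            "name": fn.__name__,
            "description": doc.splitlines()[0] if doc else "",
            "parameters": {
                "type": "object",
                "properties": properties,
                "required": required,
            },
        },
    }
```

Applied to `get_weather`, this yields a spec whose `required` list contains only `location`, since `unit` has a default value.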
## File Operations

```python
# Upload a single file to OpenWebUI
uploaded_file = client.files.from_path("document.pdf")

# Upload multiple files, optionally attaching metadata to each
uploaded_files = client.files.from_paths(
    files=[
        ("report.pdf", None),
        ("notes.txt", {"xMetaField": "xMetaValue"}),
    ]
)
uploaded_files.append(uploaded_file)

# Use the files with a chat completion
response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this document for me."},
    ],
    files=uploaded_files,
)
```
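When many documents live in one directory, the `(path, metadata)` pairs can be assembled programmatically. The helper below is hypothetical, not part of the library, and assumes `from_paths` accepts path strings as in the example above:

```python
from pathlib import Path


def collect_upload_batch(directory, pattern="*.pdf", metadata=None):
    """Gather (path, metadata) pairs in the shape files.from_paths() expects,
    applying the same optional metadata dict to every matched file."""
    return [(str(p), metadata) for p in sorted(Path(directory).glob(pattern))]
```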
## Features

- **OpenAI Compatibility**: Use the familiar OpenAI Python SDK interfaces
- **File Upload**: Upload and process files with OpenWebUI
- **File-Aware Chat Completions**: Reference files in chat completions
- **Typed Interface**: Full type hints for better IDE integration
## Documentation

Full documentation is available at https://bemade.github.io/openwebui-client/.
## Development

```bash
# Install development dependencies
pip install -e ".[dev]"

# Run tests
python -m pytest

# Build documentation
cd docs
make html
```
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Related Projects

- OpenWebUI - A user-friendly WebUI for LLMs