Python SDK for the Big-O LLM Gateway — OpenAI-compatible access to AWS Bedrock and Azure OpenAI with LlamaIndex and Haystack integrations
Big-O Python SDK
Python SDK for the Big-O LLM Gateway, a 100% OpenAI-compatible API that provides unified access to multiple LLM providers (AWS Bedrock, Azure OpenAI).
Drop-in replacements for LlamaIndex and Haystack with automatic JWT authentication.
Installation
pip install big-o-sdk
Or with uv:
uv add big-o-sdk
For development:
git clone https://gitlab.com/virtuele-gemeente-assistent/big-o-sdk.git
cd big-o-sdk
uv sync
Configuration
Set your credentials as environment variables or in a .env file:
BIG_O_CLIENT_ID=your-client-id
BIG_O_CLIENT_SECRET=your-client-secret
BIG_O_CHAT_MODEL=gpt-4o-mini
BIG_O_EMBEDDING_MODEL=text-embedding-3-large
Quick Start
LlamaIndex Agent
from llama_index.core.agent.workflow import FunctionAgent
from big_o_sdk import BigOConfig, BigOLLM
config = BigOConfig.from_env()
llm = BigOLLM(config)
def add(a: float, b: float) -> float:
    """Add two numbers."""
    return a + b
agent = FunctionAgent(tools=[add], llm=llm)
response = await agent.run(user_msg="What is 3 + 4?")
await llm.close()
LlamaIndex Embedding
from big_o_sdk import BigOConfig, BigOEmbedding
config = BigOConfig.from_env()
embed = BigOEmbedding(config)
vector = await embed.aget_query_embedding("What is AI?")
await embed.close()
Haystack Embedding
from big_o_sdk import BigOConfig
from big_o_sdk.embedders.haystack import BigOTextEmbedder, BigODocumentEmbedder
config = BigOConfig.from_env()
# Query embedding
embedder = BigOTextEmbedder(config)
result = embedder.run(text="What is AI?")
# Document embedding
from haystack import Document
doc_embedder = BigODocumentEmbedder(config)
result = doc_embedder.run(documents=[Document(content="Hello world")])
Gateway Status
from big_o_sdk import BigOClient, BigOConfig
config = BigOConfig.from_env()
client = BigOClient(config)
await client.health() # backend health (no auth)
await client.health_llm() # LLM availability (no auth)
await client.ping() # verify JWT token
models = await client.models() # list available models
await client.close()
Available Models
List available models via the API:
from big_o_sdk import BigOClient, BigOConfig
config = BigOConfig.from_env()
client = BigOClient(config)
for m in await client.models():
    print(f"  {m['id']} ({m['owned_by']})")
await client.close()
Or check the Swagger docs: https://devops.versnellers.nl/bigo/api/docs
Usage in Existing Projects
The SDK is designed as a drop-in provider for projects that already use LlamaIndex or Haystack with OpenAI/Azure. Add "big_o_sdk" as a third option in your existing provider factory.
Example: genai-adapter
The genai-adapter uses LlamaIndex for agents and Haystack for embeddings. Two files need a "big_o_sdk" branch:
1. LLM for agents (src/core/agents/base.py):
from big_o_sdk import BigOConfig
from big_o_sdk.agents.llamaindex import BigOLLM
class BaseAgent:
    def _initialize_llm(self, callback_manager):
        if settings.llm_provider == "openai":
            return OpenAI(model=..., api_key=..., callback_manager=callback_manager)
        elif settings.llm_provider == "azure":
            return AzureOpenAI(engine=..., api_key=..., callback_manager=callback_manager)
        elif settings.llm_provider == "bigo":
            config = BigOConfig.from_env()
            return BigOLLM(config, callback_manager=callback_manager)
BigOLLM is a drop-in for LlamaIndex's OpenAI/AzureOpenAI with the same achat(), acomplete(), astream_chat() interface. The FunctionAgent, tools, and workflows don't change.
2. Embeddings for search (src/services/indexer_haystack.py):
from big_o_sdk import BigOConfig
from big_o_sdk.embedders.haystack import BigOTextEmbedder
class HaystackIndexer:
    def _create_query_pipeline(self, collection_name):
        pipeline = AsyncPipeline()
        if settings.llm_provider == "openai":
            pipeline.add_component("dense_embedder", OpenAITextEmbedder(...))
        elif settings.llm_provider == "azure":
            pipeline.add_component("dense_embedder", AzureOpenAITextEmbedder(...))
        elif settings.llm_provider == "bigo":
            config = BigOConfig.from_env()
            pipeline.add_component("dense_embedder", BigOTextEmbedder(config))

        # Everything below stays identical: sparse embedder, retriever, filters
        pipeline.add_component("sparse_embedder", FastembedSparseTextEmbedder(model="Qdrant/bm25"))
        pipeline.add_component("retriever", QdrantHybridRetriever(document_store=store))
        pipeline.connect("dense_embedder.embedding", "retriever.query_embedding")
        pipeline.connect("sparse_embedder.sparse_embedding", "retriever.query_sparse_embedding")
Only the dense embedder changes. The sparse embedder (BM25), hybrid retriever, Qdrant filters (organization, doc_type, story_id), and document filtering all stay exactly the same.
3. Switch via environment variable:
AI_PROVIDER=bigo
BIG_O_CLIENT_ID=your-client-id
BIG_O_CLIENT_SECRET=your-client-secret
Switch back to Azure anytime with AI_PROVIDER=azure.
BigOConfig
All classes receive a single BigOConfig object that holds credentials and defaults. Two ways to create it:
From environment variables (production)
config = BigOConfig.from_env()
from_env() reads these environment variables and bundles them into one immutable object:
| Environment Variable | Config Field | Required | Default |
|---|---|---|---|
| BIG_O_CLIENT_ID | config.client_id | Yes | |
| BIG_O_CLIENT_SECRET | config.client_secret | Yes | |
| BIG_O_BASE_URL | config.base_url | No | https://devops.versnellers.nl/bigo/api |
| BIG_O_CHAT_MODEL | config.chat_model | No | gpt-4o-mini |
| BIG_O_EMBEDDING_MODEL | config.embedding_model | No | text-embedding-3-large |
| BIG_O_CONTEXT_WINDOW | config.context_window | No | 128000 |
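Conceptually, the lookup from_env() performs can be sketched as below. This is a hypothetical mirror of the documented fields (the class name ConfigSketch and its internals are illustrative, not the SDK's actual implementation):

```python
import os
from dataclasses import dataclass


@dataclass(frozen=True)
class ConfigSketch:
    """Illustrative stand-in for BigOConfig, built from the table above."""

    client_id: str
    client_secret: str
    base_url: str = "https://devops.versnellers.nl/bigo/api"
    chat_model: str = "gpt-4o-mini"
    embedding_model: str = "text-embedding-3-large"
    context_window: int = 128000

    @classmethod
    def from_env(cls):
        env = os.environ
        return cls(
            client_id=env["BIG_O_CLIENT_ID"],          # required: KeyError if missing
            client_secret=env["BIG_O_CLIENT_SECRET"],  # required: KeyError if missing
            base_url=env.get("BIG_O_BASE_URL", cls.base_url),
            chat_model=env.get("BIG_O_CHAT_MODEL", cls.chat_model),
            embedding_model=env.get("BIG_O_EMBEDDING_MODEL", cls.embedding_model),
            context_window=int(env.get("BIG_O_CONTEXT_WINDOW", cls.context_window)),
        )
```

The frozen dataclass mirrors the "immutable object" behavior described above: required fields raise immediately when absent, optional ones fall back to the documented defaults.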
Direct instantiation (testing / overrides)
config = BigOConfig(
client_id="test-id",
client_secret="test-secret",
chat_model="claude-haiku-4.5",
embedding_model="text-embedding-3-small",
)
Pass it to any class
One config, shared everywhere with no repeated parameters:
config = BigOConfig.from_env()
llm = BigOLLM(config) # uses config.chat_model
embed = BigOEmbedding(config) # uses config.embedding_model
embedder = BigOTextEmbedder(config) # uses config.embedding_model
client = BigOClient(config) # uses config.base_url
# Override per-instance if needed
llm = BigOLLM(config, model="claude-haiku-4.5")
Authentication
JWT tokens are managed automatically. The SDK:
- Fetches a token using config.client_id and config.client_secret (OAuth2 Client Credentials)
- Caches it and refreshes 30 seconds before expiry
- Updates the underlying OpenAI client transparently
You never handle tokens manually.
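The cache-and-refresh strategy can be illustrated with a small sketch. The names here (TokenCache, fetch_token) are hypothetical and not the SDK's internals; in practice the SDK does all of this for you:

```python
import time


class TokenCache:
    """Sketch of a refresh-before-expiry token cache (illustrative only)."""

    REFRESH_MARGIN = 30  # refresh this many seconds before the token expires

    def __init__(self, fetch_token):
        # fetch_token: callable returning (access_token, expires_in_seconds),
        # e.g. an OAuth2 Client Credentials request
        self._fetch_token = fetch_token
        self._token = None
        self._expires_at = 0.0

    def get(self):
        # Fetch a new token when we have none, or when we are within
        # REFRESH_MARGIN seconds of expiry; otherwise reuse the cached one.
        if self._token is None or time.time() >= self._expires_at - self.REFRESH_MARGIN:
            token, expires_in = self._fetch_token()
            self._token = token
            self._expires_at = time.time() + expires_in
        return self._token
```

Repeated calls within the token's lifetime reuse the cached value, so only one credential exchange happens per expiry window.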
Project Structure
src/big_o_sdk/
|-- config.py # BigOConfig
|-- errors.py # Domain exceptions
|-- auth.py # BigOAuthenticator
|-- client.py # BigOClient (health, ping, models)
|-- mixins.py # Shared token refresh logic
|-- agents/
| |-- llamaindex.py # BigOLLM
|-- embedders/
    |-- llamaindex.py # BigOEmbedding
    |-- haystack.py # BigOTextEmbedder, BigODocumentEmbedder
Examples
uv run examples/agents/llamaindex_agent.py
uv run examples/embedders/llamaindex_embedding.py
uv run examples/embedders/haystack_embedding.py
Documentation
Related Projects
- genai-adapter: RAG chatbot using LlamaIndex agents and Haystack embeddings
- scrapy: Web scraper with Haystack document indexing and hybrid search
Both projects can use Big-O as an alternative AI provider by setting AI_PROVIDER=bigo.
License
EUPL v1.2, see LICENSE for details.
Download files
Source Distribution
Built Distribution
File details
Details for the file big_o_sdk-1.1.0.tar.gz.
File metadata
- Download URL: big_o_sdk-1.1.0.tar.gz
- Upload date:
- Size: 23.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 85e6c4652c8d6c0f344db40a178a46b02cacbdf69479d52313d340f228fc1088 |
| MD5 | db14832bf5e70e36afe5521421dc04f4 |
| BLAKE2b-256 | ebc759da72e59ef95520458058f25afc58a4c823add22913b4b735bafd3e85c0 |
File details
Details for the file big_o_sdk-1.1.0-py3-none-any.whl.
File metadata
- Download URL: big_o_sdk-1.1.0-py3-none-any.whl
- Upload date:
- Size: 22.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.12.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ef60d04f8c49aeeec9e9983753757f3040b93dfefcd130919ca740ae76fc4033 |
| MD5 | 83f6553843a5865c91ab52bc7f22dba1 |
| BLAKE2b-256 | e263a396207bd0f44e9cbb795007e74d9dcd06685f4a5e010fb39f6b64ab5008 |