LlamaIndex LLM integration for Ferro Labs AI Gateway — route LlamaIndex workloads across 30+ LLM providers via a single OpenAI-compatible endpoint
Project description
llama-index-llms-ferrolabsai
LlamaIndex LLM integration for Ferro Labs AI Gateway — power LlamaIndex RAG, agents, and query engines with 30+ LLM providers through a single OpenAI-compatible endpoint, with automatic fallback, load balancing, cost tracking, and observability.
Status: 0.0.1 placeholder. The full adapter (the `FerroLabsAILLM` class) is in active development as part of the OSS Ecosystem Roadmap, Appendix A, Phase C. The 0.0.1 release exists to reserve the package name on PyPI and to signal upcoming work.
Install
```shell
pip install llama-index-llms-ferrolabsai
```
Planned API
```python
from llama_index.llms.ferrolabsai import FerroLabsAI

llm = FerroLabsAI(
    model="gpt-4o",
    base_url="http://localhost:8080",  # any Ferro Labs AI Gateway instance
    api_key="sk-ferro-...",
)

resp = llm.complete("Explain RAG in one sentence.")
print(resp.text)

# Per-response Ferro metadata (additional_kwargs)
print(resp.additional_kwargs["provider"])
print(resp.additional_kwargs["cost_usd"])
print(resp.additional_kwargs["latency_ms"])
print(resp.additional_kwargs["trace_id"])
```
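Because each response is planned to carry `provider`, `cost_usd`, and `latency_ms` in `additional_kwargs`, per-provider cost tracking reduces to a simple aggregation. A minimal sketch, using plain dicts in place of real response objects (the field names come from the planned API above; the helper itself is illustrative, not part of the package):

```python
from collections import defaultdict


def aggregate_costs(metas):
    """Sum cost_usd and average latency_ms per provider.

    Each item mimics the planned resp.additional_kwargs payload.
    """
    totals = defaultdict(lambda: {"cost_usd": 0.0, "latency_total": 0.0, "count": 0})
    for meta in metas:
        bucket = totals[meta["provider"]]
        bucket["cost_usd"] += meta["cost_usd"]
        bucket["latency_total"] += meta["latency_ms"]
        bucket["count"] += 1
    return {
        provider: {
            "cost_usd": round(b["cost_usd"], 6),
            "avg_latency_ms": b["latency_total"] / b["count"],
        }
        for provider, b in totals.items()
    }


metas = [
    {"provider": "openai", "cost_usd": 0.002, "latency_ms": 420.0},
    {"provider": "anthropic", "cost_usd": 0.003, "latency_ms": 610.0},
    {"provider": "openai", "cost_usd": 0.001, "latency_ms": 380.0},
]
print(aggregate_costs(metas))
```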
Why use this instead of OpenAI(api_base=...)?
Pointing `llama_index.llms.openai.OpenAI` at a Ferro Labs gateway already works as a drop-in. This package adds:
- First-class exposure of `provider`, `cost_usd`, `latency_ms`, and `trace_id` on `additional_kwargs`.
- Native support for Ferro extras: `route_tag`, `template_id`, `template_variables`.
- Streaming and async modes that surface all 30+ providers transparently.
- A tested compatibility matrix against LlamaIndex's chat / completion / streaming interfaces.
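Because the gateway is OpenAI-compatible, Ferro extras such as `route_tag` can travel as additional fields alongside a standard chat payload. A hypothetical sketch of payload construction, assuming the extras sit at the top level of the request body (only the field names `route_tag`, `template_id`, and `template_variables` come from the list above; the layout is an assumption):

```python
def build_request(model, messages, *, route_tag=None, template_id=None,
                  template_variables=None):
    """Assemble an OpenAI-style chat payload plus Ferro extras (hypothetical layout)."""
    payload = {"model": model, "messages": messages}
    extras = {
        "route_tag": route_tag,
        "template_id": template_id,
        "template_variables": template_variables,
    }
    # Only include extras that were actually supplied.
    payload.update({k: v for k, v in extras.items() if v is not None})
    return payload


req = build_request(
    "gpt-4o",
    [{"role": "user", "content": "Explain RAG in one sentence."}],
    route_tag="prod-rag",
)
print(req)
```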
Upstream mirror
Once stable, this package will also be mirrored as a PR to run-llama/llama_index under `llama-index-integrations/llms/llama-index-llms-ferrolabsai/` for first-party hub discoverability.
Related
- `ferrolabsai` — the core Python SDK this package wraps.
- Ferro Labs AI Gateway — the open-source gateway server.
- Documentation
License
Apache-2.0
File details
Details for the file llama_index_llms_ferrolabsai-0.0.1.tar.gz.
File metadata
- Download URL: llama_index_llms_ferrolabsai-0.0.1.tar.gz
- Upload date:
- Size: 5.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `975fdb161191dc6aced7ab9a3663a4c9de342515385ef016517e2fd3572782f6` |
| MD5 | `8683ec52027091c5460551006cffc960` |
| BLAKE2b-256 | `db67dad18292c7df57b4ccda0a563190071ce4d60697bc76f4dd41aa5e314fae` |
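The SHA256 digest above can be checked against a locally downloaded archive. A small helper that streams the file through `hashlib` (standard library only; the filename in the comment is the one published above):

```python
import hashlib


def sha256_of(path):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the published digest, e.g.:
# sha256_of("llama_index_llms_ferrolabsai-0.0.1.tar.gz") == "975fdb16..."
```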
Provenance
The following attestation bundles were made for llama_index_llms_ferrolabsai-0.0.1.tar.gz:
Publisher: publish-llama-index-llms-ferrolabsai.yml on ferro-labs/ferrolabs-python-sdk

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llama_index_llms_ferrolabsai-0.0.1.tar.gz
- Subject digest: 975fdb161191dc6aced7ab9a3663a4c9de342515385ef016517e2fd3572782f6
- Sigstore transparency entry: 1531806420
- Sigstore integration time:
- Permalink: ferro-labs/ferrolabs-python-sdk@432c7a3cf39142c96d40b82b5282a719535182ae
- Branch / Tag: refs/tags/llama-index-llms-ferrolabsai-v0.0.1
- Owner: https://github.com/ferro-labs
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-llama-index-llms-ferrolabsai.yml@432c7a3cf39142c96d40b82b5282a719535182ae
- Trigger Event: push
File details
Details for the file llama_index_llms_ferrolabsai-0.0.1-py3-none-any.whl.
File metadata
- Download URL: llama_index_llms_ferrolabsai-0.0.1-py3-none-any.whl
- Upload date:
- Size: 4.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `eb9e0dbe66d4539f13c25813d1e37375d47c29f1434f4888dd10ee839be01ace` |
| MD5 | `63f01f1014b772deed5f38263c30d011` |
| BLAKE2b-256 | `3409e677c282bd25fd2160317b2455d0679daed715dcf40e49a7fffcd2ee2546` |
Provenance
The following attestation bundles were made for llama_index_llms_ferrolabsai-0.0.1-py3-none-any.whl:
Publisher: publish-llama-index-llms-ferrolabsai.yml on ferro-labs/ferrolabs-python-sdk

Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: llama_index_llms_ferrolabsai-0.0.1-py3-none-any.whl
- Subject digest: eb9e0dbe66d4539f13c25813d1e37375d47c29f1434f4888dd10ee839be01ace
- Sigstore transparency entry: 1531806519
- Sigstore integration time:
- Permalink: ferro-labs/ferrolabs-python-sdk@432c7a3cf39142c96d40b82b5282a719535182ae
- Branch / Tag: refs/tags/llama-index-llms-ferrolabsai-v0.0.1
- Owner: https://github.com/ferro-labs
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish-llama-index-llms-ferrolabsai.yml@432c7a3cf39142c96d40b82b5282a719535182ae
- Trigger Event: push