

Project description

llama-index-llms-ferrolabsai


LlamaIndex LLM integration for Ferro Labs AI Gateway — power LlamaIndex RAG, agents, and query engines with 30+ LLM providers through a single OpenAI-compatible endpoint, with automatic fallback, load balancing, cost tracking, and observability.

Status: 0.0.1 placeholder. The full adapter (FerroLabsAI LLM class) is in active development as part of the OSS Ecosystem Roadmap, Appendix A, Phase C. The 0.0.1 release exists to reserve the package name on PyPI and signal upcoming work.


Install

pip install llama-index-llms-ferrolabsai

Planned API

from llama_index.llms.ferrolabsai import FerroLabsAI

llm = FerroLabsAI(
    model="gpt-4o",
    base_url="http://localhost:8080",   # any Ferro Labs AI Gateway instance
    api_key="sk-ferro-...",
)

resp = llm.complete("Explain RAG in one sentence.")
print(resp.text)

# Per-response Ferro metadata (additional_kwargs)
print(resp.additional_kwargs["provider"])
print(resp.additional_kwargs["cost_usd"])
print(resp.additional_kwargs["latency_ms"])
print(resp.additional_kwargs["trace_id"])

Why use this instead of OpenAI(api_base=...)?

Pointing llama_index.llms.openai.OpenAI at a Ferro Labs gateway already works as a drop-in replacement. This package adds:

  • First-class provider, cost_usd, latency_ms, trace_id exposure on additional_kwargs.
  • Native support for Ferro extras: route_tag, template_id, template_variables.
  • Streaming + async modes that surface all 30+ providers transparently.
  • A tested compatibility matrix against LlamaIndex's chat / completion / streaming interfaces.
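Because the gateway exposes an OpenAI-compatible endpoint, the existing drop-in path needs no special client at all. The following is a minimal sketch using only the standard library; the /v1/chat/completions path, the localhost:8080 URL, and the placeholder API key are assumptions based on the gateway's OpenAI compatibility, and the request is only constructed, not sent, since sending requires a running gateway:

```python
import json
import urllib.request

# OpenAI-style chat payload -- the same shape LlamaIndex's OpenAI class emits.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Explain RAG in one sentence."}],
}

# Assumed endpoint: any Ferro Labs AI Gateway instance, OpenAI-compatible path.
req = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Bearer sk-ferro-...",
        "Content-Type": "application/json",
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)  # requires a running gateway
```

The planned FerroLabsAI class would replace this hand-rolled request with a typed LLM object, while additionally surfacing the metadata fields listed above.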

Upstream mirror

Once stable, this package will also be mirrored as a PR to run-llama/llama_index under llama-index-integrations/llms/llama-index-llms-ferrolabsai/ for first-party hub discoverability.


License

Apache-2.0

Download files

Download the file for your platform.

Source Distribution

llama_index_llms_ferrolabsai-0.0.1.tar.gz (5.8 kB)

Built Distribution

llama_index_llms_ferrolabsai-0.0.1-py3-none-any.whl (4.0 kB)

File details

Hashes for llama_index_llms_ferrolabsai-0.0.1.tar.gz:

  • SHA256: 975fdb161191dc6aced7ab9a3663a4c9de342515385ef016517e2fd3572782f6
  • MD5: 8683ec52027091c5460551006cffc960
  • BLAKE2b-256: db67dad18292c7df57b4ccda0a563190071ce4d60697bc76f4dd41aa5e314fae
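The published hashes can be checked locally after downloading. A minimal sketch using Python's standard hashlib (the helper name is illustrative; the expected digest is the SHA256 published above):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the published SHA256 for the sdist:
expected = "975fdb161191dc6aced7ab9a3663a4c9de342515385ef016517e2fd3572782f6"
# assert sha256_of("llama_index_llms_ferrolabsai-0.0.1.tar.gz") == expected
```

pip verifies these hashes automatically when installing with a hash-pinned requirements file; the manual check above is only needed for out-of-band downloads.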

Provenance

The following attestation bundles were made for llama_index_llms_ferrolabsai-0.0.1.tar.gz:

Publisher: publish-llama-index-llms-ferrolabsai.yml on ferro-labs/ferrolabs-python-sdk


File details

Hashes for llama_index_llms_ferrolabsai-0.0.1-py3-none-any.whl:

  • SHA256: eb9e0dbe66d4539f13c25813d1e37375d47c29f1434f4888dd10ee839be01ace
  • MD5: 63f01f1014b772deed5f38263c30d011
  • BLAKE2b-256: 3409e677c282bd25fd2160317b2455d0679daed715dcf40e49a7fffcd2ee2546

Provenance

The following attestation bundles were made for llama_index_llms_ferrolabsai-0.0.1-py3-none-any.whl:

Publisher: publish-llama-index-llms-ferrolabsai.yml on ferro-labs/ferrolabs-python-sdk

