
llama-index-llms-helicone (OpenAI-compatible) integration

Project description

LlamaIndex LLMs Integration: Helicone

Installation

To install the required packages, run:

pip install llama-index-llms-helicone
pip install llama-index

Setup

Initialize Helicone

Set your Helicone API key via the HELICONE_API_KEY environment variable, or pass it directly as api_key. No provider API keys are needed when using the Helicone AI Gateway.

from llama_index.llms.helicone import Helicone
from llama_index.core.llms import ChatMessage

llm = Helicone(
    api_key="<helicone-api-key>",  # or set HELICONE_API_KEY env var
    model="gpt-4o-mini",  # works across providers via gateway
)
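If you prefer not to hard-code the key, you can export it in the shell before starting your process; the Helicone constructor above then picks it up from the environment. The key value below is a placeholder:

```shell
# Placeholder value; substitute your real Helicone API key
export HELICONE_API_KEY="sk-helicone-xxxxxxxx"
```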

Generate Chat Responses

You can generate a chat response by sending a list of ChatMessage instances:

message = ChatMessage(role="user", content="Tell me a joke")
resp = llm.chat([message])
print(resp)

Streaming Responses

To stream responses, use the stream_chat method:

message = ChatMessage(role="user", content="Tell me a story in 250 words")
resp = llm.stream_chat([message])
for r in resp:
    print(r.delta, end="")

Complete with Prompt

You can also generate completions with a prompt using the complete method:

resp = llm.complete("Tell me a joke")
print(resp)

Streaming Completion

To stream completions, use the stream_complete method:

resp = llm.stream_complete("Tell me a story in 250 words")
for r in resp:
    print(r.delta, end="")

Model Configuration

To use a specific model, specify it during initialization. The gateway routes the request to the right provider based on the model string:

from llama_index.llms.helicone import Helicone

llm = Helicone(model="gpt-4o-mini")
resp = llm.complete("Write a story about a dragon who can code in Rust")
print(resp)

Notes

  • Default Helicone base URL is https://ai-gateway.helicone.ai/v1. Override with api_base or HELICONE_API_BASE if needed.
  • Only HELICONE_API_KEY is required. The gateway routes to the correct provider based on the model string.
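The base-URL precedence described in the notes above (explicit argument, then the HELICONE_API_BASE environment variable, then the gateway default) can be sketched as a small helper. `resolve_api_base` is illustrative only, not part of the package:

```python
import os

# Default gateway endpoint, per the notes above
DEFAULT_HELICONE_BASE = "https://ai-gateway.helicone.ai/v1"

def resolve_api_base(api_base=None):
    """Pick the base URL: explicit argument > HELICONE_API_BASE > default."""
    return api_base or os.environ.get("HELICONE_API_BASE") or DEFAULT_HELICONE_BASE
```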

Download files

Download the file for your platform.

Source Distribution

llama_index_llms_helicone-0.1.1.tar.gz (5.0 kB)

Uploaded Source

Built Distribution


llama_index_llms_helicone-0.1.1-py3-none-any.whl (4.8 kB)

Uploaded Python 3

File details

Details for the file llama_index_llms_helicone-0.1.1.tar.gz.

File metadata

  • Download URL: llama_index_llms_helicone-0.1.1.tar.gz
  • Upload date:
  • Size: 5.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.9 on Ubuntu 24.04 (noble); CI: true

File hashes

Hashes for llama_index_llms_helicone-0.1.1.tar.gz:

  • SHA256: 9af6c269bdf8de9d22b718d2cc3cbf4c9f19a7f70ed6fcb5ba8fc759ba9226a6
  • MD5: c8c706b22d0614bb9d40f665714b7d46
  • BLAKE2b-256: 528548bddfd8881ddfe4451451ec9fb2b3db09ca5788e346f1613c8fa3313205

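To verify a downloaded archive against the digests published above, a standard-library sketch like the following can be used. The file path is whatever you downloaded, and `sha256_of` is a hypothetical helper, not part of the package:

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Stream the file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare the result against the published SHA256 digest, e.g.:
# sha256_of("llama_index_llms_helicone-0.1.1.tar.gz") == "9af6c269..."
```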

File details

Details for the file llama_index_llms_helicone-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: llama_index_llms_helicone-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 4.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.9 on Ubuntu 24.04 (noble); CI: true

File hashes

Hashes for llama_index_llms_helicone-0.1.1-py3-none-any.whl:

  • SHA256: 0ef88ae9022bdf6b76ee33a757f488d887aab4b2a2d83b65aa0e5fe8b449d5d8
  • MD5: 9e424a9ca69d95cf47bbd7d66d3a80b8
  • BLAKE2b-256: 46a984ceb20f90693ec3c899c3d9371ea3da724c2fc7c099027ee38f341c907f

