
Comet tool for logging and evaluating LLM traces

Project description

Opik
Open-source end-to-end LLM Development Platform

Confidently evaluate, test and monitor LLM applications. 

Website · Slack community · Twitter · Documentation


🚀 What is Opik?

Opik is an open-source platform for evaluating, testing and monitoring LLM applications. Built by Comet.


You can use Opik for:

  • Development:

    • Tracing: Track all LLM calls and traces during development and production (Quickstart, Integrations)
    • Annotations: Annotate your LLM calls by logging feedback scores using the Python SDK or the UI.
  • Evaluation: Automate the evaluation process of your LLM application with Datasets, Experiments and LLM-as-a-judge metrics (see the Evaluation section below).

  • Production Monitoring: Monitor your LLM application in production and easily close the feedback loop by adding error traces to your evaluation datasets.

[!TIP]
If you are looking for features that Opik doesn't have today, please raise a new GitHub discussion topic 🚀


🛠️ Installation

Opik is available as a fully open-source local installation or as a hosted solution on Comet.com. The easiest way to get started with Opik is by creating a free Comet account at comet.com.

If you'd like to self-host Opik, you can do so by cloning the repository and starting the platform using Docker Compose:

# Clone the Opik repository
git clone https://github.com/comet-ml/opik.git

# Navigate to the opik/deployment/docker-compose directory
cd opik/deployment/docker-compose

# Start the Opik platform
docker compose up --detach

# You can now visit http://localhost:5173 in your browser!

For more information about the different deployment options, please see our deployment guides:

| Installation method | Docs link |
| --- | --- |
| Local instance | Local Deployment |
| Kubernetes | Kubernetes |

🏁 Get Started

To get started, you will need to first install the Python SDK:

pip install opik

Once the SDK is installed, you can configure it by running the opik configure command:

opik configure

This will configure Opik for a local deployment by setting the local server address, or for the Cloud platform by setting your API key.

[!TIP]
You can also call the opik.configure(use_local=True) method from your Python code to configure the SDK to run on the local installation.
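
If you are using the hosted platform on Comet.com instead, the same call can be made from Python with your API key and workspace. The snippet below is a minimal sketch; the api_key and workspace parameter names are assumptions based on what the opik configure command prompts for, so check the configuration docs for your SDK version:

import opik

# Minimal sketch (assumed parameters): configure the SDK for the hosted Comet platform.
# Replace the placeholders with your own Comet API key and workspace name.
opik.configure(api_key="YOUR_COMET_API_KEY", workspace="YOUR_WORKSPACE")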

You are now ready to start logging traces using the Python SDK.

📝 Logging Traces

The easiest way to get started is to use one of our integrations. Opik supports:

| Integration | Description | Documentation | Try in Colab |
| --- | --- | --- | --- |
| OpenAI | Log traces for all OpenAI LLM calls | Documentation | Open Quickstart In Colab |
| LiteLLM | Call any LLM model using the OpenAI format | Documentation | Open Quickstart In Colab |
| LangChain | Log traces for all LangChain LLM calls | Documentation | Open Quickstart In Colab |
| Bedrock | Log traces for all Bedrock LLM calls | Documentation | Open Quickstart In Colab |
| Anthropic | Log traces for all Anthropic LLM calls | Documentation | Open Quickstart In Colab |
| Gemini | Log traces for all Gemini LLM calls | Documentation | Open Quickstart In Colab |
| Groq | Log traces for all Groq LLM calls | Documentation | Open Quickstart In Colab |
| LangGraph | Log traces for all LangGraph executions | Documentation | Open Quickstart In Colab |
| LlamaIndex | Log traces for all LlamaIndex LLM calls | Documentation | Open Quickstart In Colab |
| Ollama | Log traces for all Ollama LLM calls | Documentation | Open Quickstart In Colab |
| Predibase | Fine-tune and serve open-source Large Language Models | Documentation | Open Quickstart In Colab |
| Ragas | Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines | Documentation | Open Quickstart In Colab |
| watsonx | Log traces for all watsonx LLM calls | Documentation | Open Quickstart In Colab |

[!TIP]
If the framework you are using is not listed above, feel free to open an issue or submit a PR with the integration.
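
To show how an integration is wired up, here is a minimal sketch of the OpenAI integration: the client is wrapped with track_openai so every completion call is logged as a trace. The model name and prompt are placeholders, and the import path follows the integration documentation, so double-check it against your installed version:

from openai import OpenAI
from opik.integrations.openai import track_openai

# Wrap the OpenAI client so that every call made through it is logged to Opik.
openai_client = track_openai(OpenAI())

response = openai_client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "What is the capital of France?"}],
)
print(response.choices[0].message.content)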

If you are not using any of the frameworks above, you can also use the track function decorator to log traces:

import opik

opik.configure(use_local=True) # Run locally

@opik.track
def my_llm_function(user_question: str) -> str:
    # Your LLM code here

    return "Hello"

[!TIP]
The track decorator can be used in conjunction with any of our integrations and can also be used to track nested function calls.
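
For example, nested tracked functions produce nested spans within the same trace. The sketch below is illustrative only (the function names and the retrieval step are made up); the outer call is logged as a trace and the inner call as a child span:

import opik

opik.configure(use_local=True)  # Run locally

@opik.track
def retrieve_context(user_question: str) -> list:
    # Placeholder retrieval step; a real application might query a vector store here.
    return ["France is a country in Europe."]

@opik.track
def answer_question(user_question: str) -> str:
    # The nested call to retrieve_context is logged as a child span of this trace.
    context = retrieve_context(user_question)
    return f"Answer based on {len(context)} context chunk(s)."

answer_question("What is the capital of France?")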

🧑‍⚖️ LLM as a Judge metrics

The Opik Python SDK includes a number of LLM-as-a-judge metrics to help you evaluate your LLM application. Learn more about them in the metrics documentation.

To use them, simply import the relevant metric and call its score method:

from opik.evaluation.metrics import Hallucination

metric = Hallucination()
score = metric.score(
    input="What is the capital of France?",
    output="Paris",
    context=["France is a country in Europe."]
)
print(score)

Opik also includes a number of pre-built heuristic metrics, as well as the ability to create your own. Learn more in the metrics documentation.
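
As a rough illustration of a custom metric, the sketch below subclasses the SDK's base metric class and returns a ScoreResult. The module layout (base_metric, score_result) follows the metrics documentation but may differ between versions, and the metric itself is a toy example:

from opik.evaluation.metrics import base_metric, score_result

class ContainsParis(base_metric.BaseMetric):
    """Toy heuristic metric: checks whether the output mentions 'Paris'."""

    def __init__(self, name: str = "contains_paris"):
        self.name = name

    def score(self, output: str, **ignored_kwargs) -> score_result.ScoreResult:
        # Return 1.0 if the expected string is present in the output, 0.0 otherwise.
        value = 1.0 if "paris" in output.lower() else 0.0
        return score_result.ScoreResult(value=value, name=self.name)

metric = ContainsParis()
print(metric.score(output="The capital of France is Paris."))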

🔍 Evaluating your LLM Application

Opik allows you to evaluate your LLM application during development through Datasets and Experiments.
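
In outline, an evaluation run ties a dataset to a task function and a set of scoring metrics. The sketch below follows the evaluate API described in the evaluation docs; the dataset name, item keys and the hard-coded answer are purely illustrative, so adapt them to your application:

from opik import Opik
from opik.evaluation import evaluate
from opik.evaluation.metrics import Hallucination

# Create (or fetch) a small demo dataset; the name and items are placeholders.
client = Opik()
dataset = client.get_or_create_dataset(name="capital-cities-demo")
dataset.insert([
    {"input": "What is the capital of France?", "context": ["France is a country in Europe."]},
])

def evaluation_task(dataset_item: dict) -> dict:
    # Call your LLM application here; the returned keys must match the metric arguments.
    return {
        "input": dataset_item["input"],
        "output": "Paris",  # placeholder for your application's answer
        "context": dataset_item["context"],
    }

evaluate(
    dataset=dataset,
    task=evaluation_task,
    scoring_metrics=[Hallucination()],
)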

You can also run evaluations as part of your CI/CD pipeline using our PyTest integration.

🤝 Contributing

There are many ways to contribute to Opik. To learn more about how to contribute, please see our contributing guidelines.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

opik-1.1.12.tar.gz (136.1 kB)

Uploaded Source

Built Distribution

opik-1.1.12-py3-none-any.whl (253.5 kB)

Uploaded Python 3

File details

Details for the file opik-1.1.12.tar.gz.

File metadata

  • Download URL: opik-1.1.12.tar.gz
  • Upload date:
  • Size: 136.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.7

File hashes

Hashes for opik-1.1.12.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 8c4b82d15197fdc2190a33adfdaa8fc13430d97f0cbf1f18ca5b84ac23b61c08 |
| MD5 | a9d5790d137f74cad42bf2d3681a0cca |
| BLAKE2b-256 | 6e4c8d6a23b3f1f8aa4ed559ac6f8ed3fb5b3fde10e9df0af43eac9fae3926df |

See more details on using hashes here.

File details

Details for the file opik-1.1.12-py3-none-any.whl.

File metadata

  • Download URL: opik-1.1.12-py3-none-any.whl
  • Upload date:
  • Size: 253.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.7

File hashes

Hashes for opik-1.1.12-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | d4e2abbb113fb8c0d564f737f3ed18193205b81658447265fb74065101ffd5eb |
| MD5 | 2a1387d4734332bd5fb288b1fec30422 |
| BLAKE2b-256 | 53d959e519fc42b7c15ffb560e92836f1b34c56cfb004e694cdf3f24a630c9ca |

See more details on using hashes here.
