
Project description

Nebuly SDK

The SDK for instrumenting applications for tracking AI costs.
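
The package is published on PyPI and can be installed with pip:

pip install nebuly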

Setup

To set up the code quality checks for this project:

  1. Clone the repository
  2. Run the setup command to install the necessary requirements, including Poetry for dependency management:
make setup

Code Formatting and Linting

The code formatting and linting checks help maintain a consistent style and catch potential issues. Black and Ruff run automatically on each commit, but they can also be invoked manually without committing changes:

  • To display the issues detected by the linter:
make lint
  • To automatically apply the formatter's changes and the linter's suggested fixes:
make lint-fix

Supported Providers

- OpenAI
- Azure OpenAI
- Cohere
- Anthropic
- HuggingFace pipelines
- HuggingFace Hub
- LangChain
- LlamaIndex
- Amazon Bedrock
- Amazon SageMaker
- Google PaLM API
- Google VertexAI

Usage

Make sure you initialize Nebuly before importing other libraries such as openai, cohere, or huggingface, so that its tracking hooks are in place before those libraries are used.

Simple usage

In the simple case, you can just import nebuly and call the init function with your API key. This automatically sets up all the tracking for you; after that, you can call the other libraries as normal.

Example with OpenAI

import os
import nebuly

api_key = os.getenv("NEBULY_API_KEY")
nebuly.init(api_key=api_key)

from openai import OpenAI

client = OpenAI()
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="gpt-3.5-turbo",
    user_id="user-123",
    feature_flags=["new-feature_flag"],
)

Advanced usage: Context managers

In the simple case, each call is stored as a separate Interaction. You can use context managers to group multiple calls into a single Interaction:

Example with OpenAI and Cohere

import os
import nebuly
from nebuly.contextmanager import new_interaction

api_key = os.getenv("NEBULY_API_KEY")
nebuly.init(api_key=api_key)

# Setup OpenAI
import openai

openai.api_key = os.getenv("OPENAI_API_KEY")

# Setup Cohere
import cohere

co = cohere.Client(os.getenv("COHERE_API_KEY"))

with new_interaction(user_id="test_user", user_group_profile="test_group") as interaction:
    # interaction.set_input("Some custom input")
    # interaction.set_history([{"role": "user/assistant", "content": "sample content"}])
    completion_1 = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are an helpful assistant"},
            {"role": "user", "content": "Hello world"}
        ]
    )
    completion_2 = co.generate(
        prompt='Please explain to me how LLMs work',
    )
    # interaction.set_output("Some custom output")
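
The commented set_input, set_history, and set_output calls show how to optionally override what the Interaction records, replacing the automatically captured input, conversation history, or output with custom values.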

LangChain Callbacks
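
LangChain calls can also be tracked by passing a LangChainTrackingHandler as a standard LangChain callback: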

import os

from langchain.chains import LLMChain
from langchain.chat_models import ChatOpenAI
from langchain.prompts import PromptTemplate
from nebuly.providers.langchain import LangChainTrackingHandler

callback = LangChainTrackingHandler(
    user_id="test_user",
    api_key=os.getenv("NEBULY_API_KEY"),
)

llm = ChatOpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)

chain = LLMChain(llm=llm, prompt=prompt)
result = chain.run(
    "colorful socks",
    callbacks=[callback],
)

LlamaIndex Callbacks
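
LlamaIndex calls can be tracked by registering a LlamaIndexTrackingHandler as the library's global callback handler: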

import os
from nebuly.providers.llama_index import LlamaIndexTrackingHandler

handler = LlamaIndexTrackingHandler(
    api_key=os.getenv("NEBULY_API_KEY"), user_id="test_user"
)

import llama_index
from llama_index import SimpleDirectoryReader, VectorStoreIndex

llama_index.global_handler = handler

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What did the author do growing up?")

Variants Usage
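
The ABTesting client retrieves the variants assigned to a given user for a set of feature flags: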

from nebuly.ab_testing import ABTesting

client = ABTesting("your_nebuly_api_key")

variants = client.get_variants(
    user="<user_id>",
    feature_flags=["feature_flag_a", "feature_flag_b"],
)
print(variants)
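
The feature flags used here mirror the feature_flags argument accepted by instrumented model calls (as in the OpenAI example above), which presumably lets Nebuly attribute each tracked call to the variant assigned to the user.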

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nebuly-0.3.35.tar.gz (34.0 kB, Source)

Built Distribution

nebuly-0.3.35-py3-none-any.whl (48.0 kB, Python 3)

File details

Details for the file nebuly-0.3.35.tar.gz.

File metadata

  • Download URL: nebuly-0.3.35.tar.gz
  • Upload date:
  • Size: 34.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.0 CPython/3.11.10 Linux/6.5.0-1025-azure

File hashes

Hashes for nebuly-0.3.35.tar.gz

  • SHA256: 4afeac8a63caacb4dd149368dacbb6e8304cd77e454f9d7b5b17560581dbe7f3
  • MD5: 9e9faef8922f8a61e7d5a33891e5d385
  • BLAKE2b-256: 03bdf68f31240e7f91e4ac365fc9ea204db14b43c65c718b493d66fd4ad00966

See more details on using hashes here.
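
As a minimal sketch, you can verify a downloaded archive against the SHA256 value above using standard tools (assuming pip and sha256sum are available):

pip download nebuly==0.3.35 --no-deps --no-binary :all: -d .
sha256sum nebuly-0.3.35.tar.gz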

File details

Details for the file nebuly-0.3.35-py3-none-any.whl.

File metadata

  • Download URL: nebuly-0.3.35-py3-none-any.whl
  • Upload date:
  • Size: 48.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.7.0 CPython/3.11.10 Linux/6.5.0-1025-azure

File hashes

Hashes for nebuly-0.3.35-py3-none-any.whl

  • SHA256: f206037655c21a31c7e63fdd626244cc16ac0f7a19bde91522b1269a7c59975c
  • MD5: daf31f08a593b6a4ceee72515e5a95c1
  • BLAKE2b-256: 895e39d3eaf8bcbd8a89fe4177f41e444008b493b80b817b05511763c393d5bb

See more details on using hashes here.
