Project description

costly

Estimate costs and running times of complex LLM workflows/experiments/pipelines in advance, before spending money, via simulations. Just put @costly() on the load-bearing function, make sure every function in the call chain forwards **kwargs down to it, and call your top-level function with simulate=True and a cost_log=Costlog() object. See examples.ipynb for more details.

https://github.com/abhimanyupallavisudhir/costly
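The **kwargs pass-through requirement can be illustrated with plain Python (no costly imports; pipeline and call_llm below are hypothetical names standing in for your own functions): every intermediate function must forward **kwargs so that keyword arguments like simulate=True, passed at the top level, reach the decorated function.

```python
# Toy illustration of the **kwargs pass-through rule (plain Python, no costly).
# `call_llm` stands in for a @costly()-decorated function; `pipeline` is an
# intermediate caller. Because `pipeline` forwards **kwargs, flags like
# simulate=True passed at the top level reach `call_llm` untouched.

def call_llm(prompt: str, model: str, **kwargs) -> str:
    if kwargs.get("simulate"):
        return f"<simulated response for {model}>"
    return f"<real API call for {model}>"  # placeholder for the actual call

def pipeline(topic: str, **kwargs) -> str:
    # Forward **kwargs so simulate/cost_log-style flags flow down the stack.
    return call_llm(f"Write about {topic}", model="gpt-4o-mini", **kwargs)

print(pipeline("costs", simulate=True))   # simulated path: no money spent
print(pipeline("costs"))                  # real path
```

If pipeline dropped **kwargs, simulate=True would never reach call_llm and the real API would be hit.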

Installation

pip install costly

Usage

See examples.ipynb for a full walkthrough; some examples below.

from costly import Costlog, costly, CostlyResponse
from costly.estimators.llm_api_estimation import LLM_API_Estimation as estimator


@costly()
def chatgpt(input_string: str, model: str) -> str:
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(
        model=model, messages=[{"role": "user", "content": input_string}]
    )
    output_string = response.choices[0].message.content
    return output_string
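The simulation mechanics can be sketched with a toy decorator in plain Python (this is not costly's actual implementation; the 4-characters-per-token heuristic and the per-token price are made-up numbers): when simulate=True is passed, the wrapped function is never executed, and an estimate is appended to the shared log instead.

```python
import functools

# Toy re-implementation of the simulate/cost_log pattern (not costly's real
# code). The 4-chars-per-token heuristic and $0.01-per-1K-token price are
# invented for illustration.

def toy_costly(func):
    @functools.wraps(func)
    def wrapper(*args, simulate=False, cost_log=None, **kwargs):
        if simulate:
            input_string = args[0] if args else kwargs.get("input_string", "")
            est_tokens = len(input_string) // 4  # crude token estimate
            if cost_log is not None:
                cost_log.append({"tokens": est_tokens,
                                 "cost_usd": est_tokens * 0.01 / 1000})
            return "<simulated output>"
        return func(*args, **kwargs)
    return wrapper

@toy_costly
def fake_llm(input_string: str) -> str:
    raise RuntimeError("would have spent money here")

log: list[dict] = []
result = fake_llm("estimate this prompt", simulate=True, cost_log=log)
# The real function body never ran; the log now holds one cost estimate.
```

The point is the short-circuit: with simulate=True the decorated body is bypassed entirely, so no API call is made and no money is spent.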


@costly(
    input_string=lambda kwargs: estimator.messages_to_input_string(
        kwargs["messages"]
    ),
)
def chatgpt_messages(messages: list[dict[str, str]], model: str) -> str:
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(model=model, messages=messages)
    output_string = response.choices[0].message.content
    return output_string
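What an input-mapping lambda like the one above accomplishes can be shown in plain Python (messages_to_string below is a simplified stand-in, not the estimator's actual messages_to_input_string): it flattens an OpenAI-style messages list into a single string whose length the estimator can work with.

```python
# Simplified stand-in for an input-mapping function such as
# LLM_API_Estimation.messages_to_input_string (not the library's actual
# logic): flatten an OpenAI-style messages list into one string so a
# length-based cost estimator has something to measure.

def messages_to_string(messages: list[dict[str, str]]) -> str:
    return "\n".join(f"{m['role']}: {m['content']}" for m in messages)

messages = [
    {"role": "system", "content": "You are terse."},
    {"role": "user", "content": "Estimate my costs."},
]
flat = messages_to_string(messages)
```

Since the estimator works on an input string, any function whose prompt arrives in a different shape needs such a mapping declared in its @costly(...) arguments.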


# Alternative: report exact token usage back to costly via CostlyResponse.
@costly()
def chatgpt(input_string: str, model: str) -> str:
    from openai import OpenAI

    client = OpenAI()
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "user", "content": input_string},
        ],
    )

    return CostlyResponse(
        output=response.choices[0].message.content,
        cost_info={
            "input_tokens": response.usage.prompt_tokens,
            "output_tokens": response.usage.completion_tokens,
        },
    )  # callers still receive only the output string; costly unwraps the CostlyResponse and records cost_info

Testing

poetry run pytest -s
