OpenAI Token Counter

Token counter for OpenAI messages with support for function token calculation. This project is a Python port of the following repository: https://github.com/hmarr/openai-chat-tokens

As stated in hmarr's project:

Estimating token usage for chat completions isn't quite as easy as it sounds. For regular chat messages, you need to consider how the messages are formatted by OpenAI when they're provided to the model, as they don't simply dump the JSON messages they receive via the API into the model. For function calling, things are even more complex, as the OpenAPI-style function definitions get rewritten into TypeScript type definitions. This library handles both of those cases, as well as a minor adjustment needed for handling the results of function calling. tiktoken is used to do the tokenization.
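
To make the message-formatting overhead above concrete, here is a minimal sketch (not part of this library) that estimates prompt tokens for plain chat messages with tiktoken. The per-message and reply-priming constants are approximations for gpt-3.5-turbo/gpt-4 style models and may change without notice.

import tiktoken

def rough_chat_token_estimate(messages: list[dict], model: str = "gpt-3.5-turbo") -> int:
    """Rough prompt-token estimate for chat messages without functions."""
    try:
        encoding = tiktoken.encoding_for_model(model)
    except KeyError:
        encoding = tiktoken.get_encoding("cl100k_base")
    total = 0
    for message in messages:
        total += 3  # approximate per-message formatting overhead
        for value in message.values():
            total += len(encoding.encode(value))
    return total + 3  # approximate reply-priming overhead

print(rough_chat_token_estimate([{"role": "user", "content": "hello"}]))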

This library is tested nightly against the OpenAI API to detect potential breakage from internal changes, because, as stated above, the token calculation relies on internal OpenAI formatting details that are not exposed and can change without notice.
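
To run a similar sanity check yourself, you can compare the library's estimate with the prompt_tokens reported by the API. The sketch below assumes the openai v1 Python client and an OPENAI_API_KEY in your environment; it is not part of this library's test suite.

from openai import OpenAI
from openai_token_counter import openai_token_counter

messages = [{"role": "user", "content": "hello"}]

estimate = openai_token_counter(messages=messages, model="gpt-3.5-turbo")

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=messages,
    max_tokens=1,  # keep the completion tiny; we only care about prompt usage
)
actual = response.usage.prompt_tokens

print(f"estimated={estimate}, reported by API={actual}")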

Installation

You can install OpenAI Token Counter via pip from PyPI:

$ pip install openai-token-counter

Usage

from openai_token_counter import openai_token_counter

messages = [{"role": "user", "content": "hello"}]
functions = [
    {
        "name": "bing_bong",
        "description": "Do a bing bong",
        "parameters": {
            "type": "object",
            "properties": {
                "foo": {"type": "string"},
                "bar": {"type": "number", "description": "A number"},
            }
        }
    }
]

result = openai_token_counter(
    messages=messages,
    model="gpt-3.5-turbo", # Optional, defaults to cl100k_base encoding, which is used by GPT models
    functions=functions, # Optional
    function_call="auto" # Optional
)

print(result) # Output: '57'
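
Counting plain messages without functions works the same way; if you omit model, the counter falls back to the cl100k_base encoding, as noted in the comment above.

from openai_token_counter import openai_token_counter

messages = [{"role": "user", "content": "hello"}]

# No functions and no model: the default cl100k_base encoding is used.
print(openai_token_counter(messages=messages))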

Contributing

Contributions are very welcome.

  1. Install poetry.
  2. Install the project dependencies:
     poetry install
  3. Make your changes.
  4. Test locally using nox (no need to test all Python versions, select only 3.10):
     nox --python=3.10
  5. Create a PR on GitHub.

License

Distributed under the terms of the MIT license, OpenAI Token Counter is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openai_token_counter-1.0.2.tar.gz (6.4 kB)

Built Distribution

openai_token_counter-1.0.2-py3-none-any.whl (7.6 kB)
