
Project description

tokenlog

Simplest token log system for your LLM, embedding model calls.

Installation

Available on PyPI:

pip install tokenlog

How to use

Start by initializing the logger. Loggers are singletons per name: calling getLogger again with the same name returns the same instance.

import tokenlog

t_logger = tokenlog.getLogger('session_1', 'gpt-3.5-turbo') # pass the logger name and the model name you are using
q1 = t_logger.query('This is the query that you used in LLM') # log the query

t_logger.answer('This is an answer from LLM', q1) # log the answer

t_logger.get_token_usage() # get total token usage from all queries

t_logger.get_history() # get history of token usage

t_logger.clear() # clear all histories
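The calls above amount to simple bookkeeping: each query is tokenized and stored under an id, and each answer is tokenized and attached to its query. Here is a minimal self-contained sketch of that bookkeeping, using whitespace splitting as a stand-in for real tiktoken/AutoTokenizer counting. The class and method names mirror the tokenlog API, but this is an illustration, not its implementation:

```python
import itertools

class SketchTokenLogger:
    """Toy logger illustrating query/answer bookkeeping (not the real tokenlog)."""

    def __init__(self):
        self._ids = itertools.count()
        self._history = []  # entries: {"id", "query_tokens", "answer_tokens"}

    @staticmethod
    def _count(text):
        # Stand-in tokenizer: whitespace split instead of a real tokenizer.
        return len(text.split())

    def query(self, text):
        qid = next(self._ids)
        self._history.append(
            {"id": qid, "query_tokens": self._count(text), "answer_tokens": 0}
        )
        return qid

    def answer(self, text, qid):
        for entry in self._history:
            if entry["id"] == qid:
                entry["answer_tokens"] = self._count(text)

    def get_token_usage(self):
        return sum(e["query_tokens"] + e["answer_tokens"] for e in self._history)

    def get_history(self):
        return list(self._history)

    def clear(self):
        self._history.clear()

logger = SketchTokenLogger()
q = logger.query("hello there model")  # 3 "tokens" under the toy tokenizer
logger.answer("hi user", q)            # 2 "tokens"
print(logger.get_token_usage())        # → 5
```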

Batch logging

You can log multiple queries and answers at once.

import tokenlog

t_logger = tokenlog.getLogger('session_2', 'gpt-3.5-turbo') # pass the logger name and the model name you are using
query_ids = t_logger.query_batch(['This is the query that you used in LLM', 'This is the second query'])
t_logger.answer_batch(['This is the first answer', 'This is the second answer'], query_ids) # log the corresponding answers

Chat Support

We also support chat-format logging, using the OpenAI-style chat message format.

import tokenlog

t_logger = tokenlog.getLogger('session_3', 'gpt-5')
chat1 = t_logger.query([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
])
t_logger.chat([
    {"role": "assistant", "content": "The 2020 World Series was played at Globe Life Field in Arlington, Texas."}
], chat1)
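Token accounting for chat messages is slightly more involved than for plain strings, because each message carries per-message overhead on top of its role and content tokens. The sketch below shows the general shape of such an estimate; whitespace splitting stands in for a real tokenizer, and the overhead constants follow OpenAI's published heuristics for gpt-3.5/gpt-4-era models as approximations, not exact counts for any given model:

```python
def count_chat_tokens(messages, tokens_per_message=3, priming_tokens=3):
    """Rough chat token estimate: per-message overhead plus role/content tokens.

    The whitespace split is a stand-in for a real tokenizer, and the default
    constants are assumptions borrowed from OpenAI's token-counting guidance.
    """
    total = priming_tokens  # every reply is primed with the assistant role
    for msg in messages:
        total += tokens_per_message
        total += len(msg["role"].split()) + len(msg["content"].split())
    return total

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Who won the world series in 2020?"},
]
print(count_chat_tokens(messages))  # → 23 under the toy tokenizer
```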

Supported Models

We support all OpenAI models via tiktoken, as well as Hugging Face models compatible with AutoTokenizer.
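The statement above implies two tokenizer backends behind one interface. A hypothetical sketch of how model names might be routed between them (the heuristic — Hugging Face model ids are `org/name` repo paths, OpenAI model names contain no slash — is an illustrative assumption, not tokenlog's actual logic):

```python
def pick_tokenizer_backend(model_name):
    """Guess which tokenizer family serves a model name.

    Heuristic only: Hugging Face model ids are 'org/name' repo paths, while
    OpenAI model names ('gpt-3.5-turbo', 'text-embedding-3-small') have no
    slash. This is an assumption for illustration, not tokenlog's code.
    """
    return "transformers.AutoTokenizer" if "/" in model_name else "tiktoken"

print(pick_tokenizer_backend("gpt-3.5-turbo"))              # → tiktoken
print(pick_tokenizer_backend("mistralai/Mistral-7B-v0.1"))  # → transformers.AutoTokenizer
```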

Use Case

This library is used in the AutoRAG project.

To-do

  • Add handlers for exporting logs
  • Support more models

Project details


Download files

Download the file for your platform.

Source Distribution

tokenlog-0.0.3.tar.gz (82.8 kB)

Uploaded Source

Built Distribution


tokenlog-0.0.3-py3-none-any.whl (6.0 kB)

Uploaded Python 3

File details

Details for the file tokenlog-0.0.3.tar.gz.

File metadata

  • Download URL: tokenlog-0.0.3.tar.gz
  • Upload date:
  • Size: 82.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for tokenlog-0.0.3.tar.gz
Algorithm Hash digest
SHA256 c740befdb9ab39bfac36ef122c1f63a52b44e84929a7c08e282502f2abe2bdbb
MD5 f5a889d86c13531b17c7c14d393a906d
BLAKE2b-256 53cd2e931ee02aa73f5777f089eb8de99885051861e2ed164c0796f2610398b3


File details

Details for the file tokenlog-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: tokenlog-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 6.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.4

File hashes

Hashes for tokenlog-0.0.3-py3-none-any.whl
Algorithm Hash digest
SHA256 1e7f471b1d509b916848f17b729788cb6ff7aa83f5b27a0ab7060c72c3cbac51
MD5 3a113bc7f09e21ed1d46ef3fe4ec02a7
BLAKE2b-256 196fd6fe5b804ce96565449a7185bbab717158c93ff95eb75fe61436ac9b59a1

