tokenlog

Simplest token log system for your LLM and embedding model calls.

Installation

Install from PyPI:

pip install tokenlog

How to use

Start by initializing the logger. Loggers with the same name are singletons, so calling getLogger again with the same name returns the same instance.

import tokenlog

t_logger = tokenlog.getLogger('session_1', 'gpt-3.5-turbo') # pass the logger name and the model name you are using
q1 = t_logger.query('This is the query that you used in LLM') # log the query

t_logger.answer('This is an answer from LLM', q1) # log the answer

t_logger.get_token_usage() # get total token usage from all queries

t_logger.get_history() # get history of token usage

t_logger.clear() # clear all histories
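
Because loggers are keyed by name, you can pick the same logger up anywhere in your code. A minimal sketch of that singleton behavior, reusing the 'session_1' logger created above:

import tokenlog

same_logger = tokenlog.getLogger('session_1', 'gpt-3.5-turbo') # same name, so this returns the existing 'session_1' logger
same_logger.get_token_usage() # reflects everything already logged through t_logger above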

Batch logging

You can log multiple queries and answers at once.

import tokenlog

t_logger = tokenlog.getLogger('session_2', 'gpt-3.5-turbo') # pass the logger name and the model name you are using
query_ids = t_logger.query_batch(['This is the query that you used in LLM', 'This is the second query'])
t_logger.answer_batch(['This is the first answer', 'This is the second answer'], query_ids) # log the answers for the matching query ids (method name assumed to mirror query_batch)
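
After batch logging, the accessors from the basic example work the same way; the exact return shapes are not documented here, so treat this as a sketch:

t_logger.get_token_usage() # total token usage across both batched queries and their answers
t_logger.get_history() # token usage history for each logged call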

Supported Models

We support all OpenAI models via tiktoken and any Hugging Face model that works with AutoTokenizer.
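
For a Hugging Face model, pass the model's repository id as the model name when creating the logger. The id below is only an illustration; any model that AutoTokenizer can load should work the same way:

import tokenlog

hf_logger = tokenlog.getLogger('session_hf', 'mistralai/Mistral-7B-Instruct-v0.2') # example repository id, not a requirement
q = hf_logger.query('How many tokens is this?')
hf_logger.answer('A handful.', q)
hf_logger.get_token_usage() # token counts computed with the model's AutoTokenizer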

Use Case

This library is used in the AutoRAG project.

To-do

  • Add Handlers for exporting logs
  • Support more models
  • Batch logging
