
A tool for logging and exporting AI prompts and responses

Project description

prompt-logger

Prompt Logger reliably records the AI prompts you execute and the responses they produce, so you can focus on experimentation with minimal instrumentation code.

Features

  • Programmatically log AI/chatbot interactions (prompts and responses) to a SQLite database
  • Command-line interface to export logs to JSONL format
  • Support for multiple user-defined namespaces
  • Decorator for easy integration with existing code

Installation

pip install prompt-logger

Usage

Working with OpenAI-style clients

If you're using the OpenAI client, or any client that exposes the chat.completions.create interface, you can attach Prompt Logger to it and it will automatically record your prompts and the generated completions.

from prompt_logger import PromptLogger
from openai import OpenAI

# Create the logger with a namespace and a database
logger = PromptLogger(namespace="my-namespace", database="sqlite:///my_prompts.db")

# Attach the logger to an AI client
client = OpenAI()
logger.attach_to_client(client)

# All completion requests to the client are logged
response = client.chat.completions.create(
   model="gpt-4.1",
   messages=[
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is the weather today?"}
   ],
   max_tokens=50
)

# Export used models as a JSONL file
logger.export_models("models.jsonl")

# Export chat prompts to a JSONL file
logger.export_chat_prompts("prompts.jsonl")

Each exported model record contains metadata about a model used in completion requests.

{"id": "74f0b720-dc78-491d-a123-2c33de50d2ee", "namespace": "my-namespace", "name": "gpt-4.1", "provider": "system", "created": 1744316542.0}

Each exported prompt record captures the parameters of a completion request along with the generated completions and timing information.

{"id": "d215d38f-ec59-4d20-9493-1c7f4e9a977f", "namespace": "default", "model": "gpt-4.1", "messages": [{"role": "system", "content": "You are a helpful assistant."}, {"role": "user", "content": "What is the weather today?"}], "generation_kwargs": {"max_tokens": 50}, "completions": [{"role": "assistant", "content": "I don't have access to real-time data or current weather updates. For today's weather, you can:\n\n- Check your preferred weather app (such as The Weather Channel, AccuWeather, or your phone's built-in weather app)\n- Search \"weather", "finish_reason": "length"}], "inference_on": 1745360143.533253, "inference_seconds": 1.272673}
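Because each line of an exported JSONL file is an independent JSON object, the logs are easy to post-process. A minimal sketch, assuming only the fields shown in the sample records above:

```python
import json

def load_records(path):
    """Parse one JSON object per line from an exported JSONL file."""
    with open(path) as f:
        return [json.loads(line) for line in f if line.strip()]

# Example: summarize exported prompts using fields from the sample record above.
# records = load_records("prompts.jsonl")
# models = {r["model"] for r in records}
# total_seconds = sum(r["inference_seconds"] for r in records)
```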

Working with custom code

You can use the save_interaction method to log one-off prompt/response pairs.

from prompt_logger import PromptLogger, capture

# Initialize the logger
logger = PromptLogger("my-namespace", database="sqlite:///my_prompts.db")

# Log a single prompt and response
logger.save_interaction("What is the weather?", "It's sunny!")

# Export to JSONL
logger.export_text_prompts("prompts.jsonl")

Or use the capture decorator to log prompts and responses automatically whenever the decorated function is called.

# Use the decorator to automatically log prompts
@capture(namespace="my-namespace", database="sqlite:///my_prompts.db")
def generate_text(prompt):
    # Your LLM call here
    return "Generated response"
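Conceptually, a capture-style decorator wraps the function call and records the prompt argument together with the returned text. The following is an illustrative sketch of that general pattern, not prompt-logger's actual implementation; the capture_to_list decorator and its in-memory log are invented for this example:

```python
import functools

def capture_to_list(log):
    """Illustrative stand-in for a capture-style decorator: records the
    prompt argument and the wrapped function's return value in `log`."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(prompt, *args, **kwargs):
            response = fn(prompt, *args, **kwargs)
            log.append({"prompt": prompt, "response": response})
            return response
        return wrapper
    return decorator

interactions = []

@capture_to_list(interactions)
def generate_text(prompt):
    return "Generated response"

generate_text("What is the weather?")
# interactions now holds one {"prompt": ..., "response": ...} record
```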

Using the command line tool

You can use the command line tool to export models and prompts previously logged to the database.

$ prompt-logger export models models.jsonl --namespace=my-namespace --database=sqlite:///my_prompts.db
$ prompt-logger export prompts prompts.jsonl --namespace=my-namespace --database=sqlite:///my_prompts.db

Development

  1. Clone the repository
  2. Install development dependencies:
    pip install -e ".[dev]"
    
  3. Run tests:
    pytest
    

License

This project is licensed under the Apache License 2.0 - see the LICENSE file for details.

Project details


Download files

Download the file for your platform.

Source Distribution

prompt-logger-0.2.0.tar.gz (24.2 kB)

Uploaded Source

Built Distribution


prompt_logger-0.2.0-py3-none-any.whl (24.9 kB)

Uploaded Python 3

File details

Details for the file prompt-logger-0.2.0.tar.gz.

File metadata

  • Download URL: prompt-logger-0.2.0.tar.gz
  • Upload date:
  • Size: 24.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for prompt-logger-0.2.0.tar.gz

  • SHA256: 82c1892ec0975a17b0e41be84ed79158a50e5b2dfd6be2b82286742fd66acb46
  • MD5: d94833aadf9ba0d5401aae784d5f8984
  • BLAKE2b-256: 9cb51594bbb09664abfa0b44cfd3f257702ea19c58378940e71efe8325747d5c
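To check a downloaded archive against the SHA256 digest above, you can hash it locally; a small sketch using Python's standard hashlib:

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Hash the file in chunks so large archives aren't read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "82c1892ec0975a17b0e41be84ed79158a50e5b2dfd6be2b82286742fd66acb46"
# assert sha256_of("prompt-logger-0.2.0.tar.gz") == expected
```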


Provenance

The following attestation bundles were made for prompt-logger-0.2.0.tar.gz:

Publisher: release.yaml on rotationalio/prompt-logger

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file prompt_logger-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: prompt_logger-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 24.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for prompt_logger-0.2.0-py3-none-any.whl

  • SHA256: 3fff63175ce44a13b5fac326b7cba2fc2bcbfd085674dffca7b25798d234f029
  • MD5: 56f87235ddb82e548272ac129020b4a6
  • BLAKE2b-256: 02ef10db13e032609772c274ad2e9a69321e57d38d247017ae1e007c4394a804


Provenance

The following attestation bundles were made for prompt_logger-0.2.0-py3-none-any.whl:

Publisher: release.yaml on rotationalio/prompt-logger

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
