
promptwatch.io Python client to trace LangChain sessions

Project description

PromptWatch.io: session tracking for LangChain

It enables you to:

  • track all chain executions
  • track LLM Prompts and re-play the LLM runs with the same input parameters and model settings to tweak your prompt template
  • track your costs per project and per tenant (your customer)

Installation

pip install promptwatch

Basic usage

To enable session tracking, wrap your chain executions in a PromptWatch block:

from langchain import OpenAI, LLMChain, PromptTemplate
from promptwatch import PromptWatch

prompt_template = PromptTemplate.from_template("Finish this sentence {input}")
my_chain = LLMChain(llm=OpenAI(), prompt=prompt_template)

with PromptWatch(api_key="<your-api-key>") as pw:
    my_chain("The quick brown fox jumped over")

You can get your API key here: http://www.promptwatch.io/get-api-key (no registration needed)

You can pass it directly to the PromptWatch constructor, or set it as the PROMPTWATCH_API_KEY environment variable.
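A minimal sketch of the environment-variable option, assuming the client falls back to PROMPTWATCH_API_KEY when no api_key argument is passed:

import os

# set the key once, e.g. at application startup
os.environ["PROMPTWATCH_API_KEY"] = "<your-api-key>"

# no api_key argument needed inside the block
with PromptWatch() as pw:
    my_chain("The quick brown fox jumped over")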

Comprehensive Chain Execution Tracking

With PromptWatch.io, you can track all chains, actions, retrieved documents, and more to gain complete visibility into your system. This makes it easy to identify issues with your prompts and quickly fix them for optimal performance.

What sets PromptWatch.io apart is its intuitive and visual interface. You can easily drill down into the chains to find the root cause of any problems and get a clear understanding of what's happening in your system.

Read more here: Chain tracing documentation

LLM Prompt caching

It is often the case that some prompts are repeated over and over, which is both costly and slow. With PromptWatch you just wrap your LLM model in the CachedLLM interface and it will automatically reuse previously generated responses.
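A minimal sketch of what the wrapping looks like, assuming CachedLLM can be imported from promptwatch and takes the inner LLM as its argument (the exact import path and constructor parameters may differ; see the caching documentation):

from langchain import OpenAI, LLMChain, PromptTemplate
from promptwatch import CachedLLM  # import path assumed

prompt_template = PromptTemplate.from_template("Finish this sentence {input}")

# wrap the underlying LLM; repeated prompts are answered from the cache
cached_llm = CachedLLM(OpenAI())
my_chain = LLMChain(llm=cached_llm, prompt=prompt_template)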

Read more here: Prompt caching documentation

LLM Prompt Template Tweaking

Tweaking prompt templates to find the optimal variation can be a time-consuming and challenging process, especially when dealing with multi-stage LLM chains. Fortunately, PromptWatch.io can help simplify the process!

With PromptWatch.io, you can easily experiment with different prompt variants by replaying any given LLM chain with the exact same inputs used in real scenarios. This allows you to fine-tune your prompts until you find the variation that works best for your needs.

Read more here: Prompt tweaking documentation

Keep Track of Your Prompt Template Changes

Making changes to your prompt templates can be a delicate process, and it's not always easy to know what impact those changes will have on your system. Version control platforms like Git are great for tracking code changes, but they're not always the best solution for tracking prompt changes. PromptWatch.io offers prompt template versioning instead; a hypothetical sketch is shown below.
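The sketch assumes a register_prompt_template helper (the name, signature, and return value are assumptions; see the versioning documentation for the exact API):

from langchain import PromptTemplate
from promptwatch import register_prompt_template  # name assumed

prompt_template = PromptTemplate.from_template("Finish this sentence {input}")

# register the template under a stable name so edits to its text
# can be tracked as new versions
prompt_template = register_prompt_template("finish-sentence", prompt_template)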

Read more here: Prompt template versioning documentation

Unit testing

Unit tests help you understand the impact that changes to your prompt templates and code have on representative session examples.

Read more here: Unit tests documentation

