
Imaginary Dev OpenAI wrapper




Wrapper library for openai to send events to the Imaginary Programming monitor


  • Patches the openai library so that users can set an ip_project_key for each request
  • Works out of the box with langchain
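The first bullet describes a standard monkey-patching pattern: replace the library's request function with a wrapper that pops the extra keyword argument, records an event, and delegates to the original. A minimal sketch of that pattern, using stand-in names (FakeAPI, events, patch) rather than im_openai's actual internals:

```python
# A sketch of the monkey-patching pattern; all names below (FakeAPI,
# events, patch) are hypothetical stand-ins, not im_openai's real code.

events = []  # stands in for the monitor's outgoing event queue


class FakeAPI:
    """Stand-in for a third-party client such as openai.ChatCompletion."""

    @staticmethod
    def create(**kwargs):
        return {"echo": kwargs}


def patch(api):
    """Replace api.create with a wrapper that strips and reports ip_project_key."""
    original = api.create

    def wrapper(**kwargs):
        # Pop the extra key so the wrapped library never sees it.
        project_key = kwargs.pop("ip_project_key", None)
        if project_key is not None:
            events.append({"project_key": project_key, "params": dict(kwargs)})
        return original(**kwargs)

    api.create = staticmethod(wrapper)


patch(FakeAPI)
result = FakeAPI.create(prompt="This is a test", ip_project_key="my_project_key")
```

After the patch, result is the normal API response and the key has been captured in events, which is presumably how ip_project_key can ride along on every call without the wrapped library itself knowing about it.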

Get Started

At startup, before any openai calls, patch the library with the following code:

from im_openai import patch_openai

patch_openai()

Then, set the ip_project_key for each request:

import openai

completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "This is a test"}],
    ip_project_key="my_project_key",
)
If you're using langchain, you can set the ip_project_key in the langchain llm setup via model_kwargs:

from langchain.llms import OpenAI

llm = OpenAI(
    model_kwargs={"ip_project_key": "my_project_key"},
)

This package was created with Cookiecutter_ and the `audreyr/cookiecutter-pypackage`_ project template.

.. _Cookiecutter: https://github.com/audreyr/cookiecutter
.. _`audreyr/cookiecutter-pypackage`: https://github.com/audreyr/cookiecutter-pypackage

History

0.1.0 (2023-06-20)

  • First release on PyPI.

0.1.1 (2023-06-23)

  • add TemplateString helper and support for data / params

0.1.2 (2023-06-23)

  • add support for original template too

0.2.0 (2023-06-26)

  • add explicit support for passing the "prompt template text"

0.3.0 (2023-06-28)

  • add support for chat templates (as objects instead of arrays)

0.4.0 (2023-06-29)

  • switch event reporting to be async / non-blocking

0.4.1 (2023-06-29)

  • add utility for formatting langchain messages

0.4.2 (2023-06-29)

  • remove stray breakpoint

0.4.3 (2023-06-30)

  • pass along chat_id
  • attempt to auto-convert langchain prompt templates

0.4.4 (2023-06-30)

  • remove stray prints

0.5.0 (2023-07-06)

  • Add langchain callbacks handlers

0.6.0 (2023-07-10)

  • Handle duplicate callbacks, agents, etc

0.6.1 (2023-07-12)

  • Fix prompt retrieval in deep chains

0.6.2 (2023-07-13)

  • Handle cases where input values are not strings

0.6.3 (2023-07-18)

  • Better support for server-generated event ids (pre-llm sends event, post-llm re-uses the same id)
  • more tests for different kinds of templates


  • include temporary patched version of loads()


  • breaking change: move im_openai.langchain_util to im_openai.langchain
  • add support for injecting callbacks into all langchain calls using tracing hooks


  • Pass along model params to the server

Download files

Source Distribution

im_openai-0.7.2.tar.gz (22.7 kB)

Built Distribution

im_openai-0.7.2-py2.py3-none-any.whl (14.8 kB)
