
ChatLab

Chat Experiments, Simplified

💬🔬

ChatLab is a Python package that makes it easy to experiment with OpenAI's chat models. It provides a simple interface for chatting with the models and a way to register Python functions that the models can call.

Best of all, it's interactive in the notebook!

Notebooks to get started with

Introduction

import chatlab
import random

def flip_a_coin():
    '''Returns heads or tails'''
    return random.choice(['heads', 'tails'])

chat = chatlab.Chat()
chat.register(flip_a_coin)

await chat("Please flip a coin for me")
 𝑓  Ran `flip_a_coin`

Input:

{}

Output:

"tails"
It landed on tails!

In the notebook, text streams into a Markdown output, and function inputs and outputs appear in a nice collapsible display, much like ChatGPT Plugins.

TODO: Include GIF/mp4 of this in action
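
The examples here use await at the top level, which works in notebooks because an event loop is already running. In a plain Python script, a sketch along these lines runs the same chat via asyncio.run (the collapsible display is designed for notebooks, so output outside one may be plainer):

import asyncio
import random

import chatlab

def flip_a_coin():
    '''Returns heads or tails'''
    return random.choice(['heads', 'tails'])

async def main():
    chat = chatlab.Chat()
    chat.register(flip_a_coin)
    # Same call as above, just awaited inside an async function
    await chat("Please flip a coin for me")

asyncio.run(main())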

Installation

pip install chatlab

Configuration

You'll need to set your OPENAI_API_KEY environment variable, which you can find on your OpenAI account page. When working locally, I recommend putting it in a .env file.

On hosted notebook environments, set it in your Secrets to keep it safe from prying LLM eyes.
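
For example, with the python-dotenv package (an assumption here, not a chatlab dependency; any way of exporting the variable works), loading a local .env file looks like this:

# .env (keep this file out of version control)
# OPENAI_API_KEY=sk-...

# Load it before creating a Chat. Requires: pip install python-dotenv
from dotenv import load_dotenv

load_dotenv()  # reads .env and populates os.environ with OPENAI_API_KEY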

What can Chats enable you to do?

💬

Where Chats take it to the next level is with Chat Functions. You can:

  • declare a function
  • register the function in your Chat
  • watch as Chat Models call your functions!

You may recall this kind of behavior from ChatGPT Plugins. Now, you can take this even further with your own custom code.

As an example, let's give the large language models the ability to tell time.

from datetime import datetime
from pytz import timezone, all_timezones, utc
from typing import Optional
from pydantic import BaseModel

def what_time(tz: Optional[str] = None):
    '''Current time, defaulting to UTC'''
    if tz is None:
        tz = utc
    elif tz in all_timezones:
        tz = timezone(tz)
    else:
        return 'Invalid timezone'

    return datetime.now(tz).strftime('%I:%M %p')

class WhatTime(BaseModel):
    tz: Optional[str] = None

Let's break this down.

what_time is the function we're going to provide access to. Its docstring forms the description for the model while the schema comes from the pydantic BaseModel called WhatTime.
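
To get a rough sense of what the model sees, you can ask pydantic for the JSON schema directly. The call below assumes pydantic v2 (on v1 it would be WhatTime.schema()), and the output is abbreviated:

WhatTime.model_json_schema()
# {'properties': {'tz': {'anyOf': [{'type': 'string'}, {'type': 'null'}],
#                        'default': None, 'title': 'Tz'}},
#  'title': 'WhatTime', 'type': 'object'}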

import chatlab

chat = chatlab.Chat()

# Register our function
chat.register(what_time, WhatTime)

After that, we can call chat with plain strings (which are turned into user messages) or with the simple message makers that chatlab exports, named user and system.

await chat("What time is it?")
 𝑓  Ran `what_time`

Input:

{}

Output:

"11:19 AM"
The current time is 11:19 AM.
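
To steer a follow-up question with the message makers, one sketch is to add a system message to the history before asking again (chat.messages is the raw message list described below, and system is the message maker shown in the Messaging section):

from chatlab import system

# Nudge the model, then ask the same question again
chat.messages.append(system("Always include the 24-hour time as well"))
await chat("What time is it now?")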

Interface

The chatlab package exports

Chat

The Chat class is the main way to chat using OpenAI's models. It keeps a history of your chat in Chat.messages.

Chat.submit

submit is how you send all of the currently built-up messages over to OpenAI. Markdown output will display responses from the assistant.

await chat.submit('What does a parent of three kids mean by "I have to play zone defense"?')
# Markdown response inline
chat.messages
[{'role': 'user',
  'content': 'What does a parent of three kids mean by "I have to play zone defense"?'},
 {'role': 'assistant',
  'content': 'When a parent of three kids says "I have to play zone defense," it means that they...'}]

Chat.register

You can register functions with Chat.register to make them available to the chat model. The function's docstring becomes the description of the function while the schema is derived from the pydantic.BaseModel passed in.

from datetime import datetime
from typing import Optional

from pydantic import BaseModel
from pytz import all_timezones, timezone, utc

class WhatTime(BaseModel):
    tz: Optional[str] = None

def what_time(tz: Optional[str] = None):
    '''Current time, defaulting to UTC'''
    if tz is None:
        tz = utc
    elif tz in all_timezones:
        tz = timezone(tz)
    else:
        return 'Invalid timezone'

    return datetime.now(tz).strftime('%I:%M %p')

chat.register(what_time, WhatTime)

Chat.messages

The raw messages sent to and received from OpenAI. If you hit a token limit, you can remove old messages from the list to make room for more.

chat.messages = chat.messages[-100:]
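
If your history begins with system context you want to keep, a small helper like this (hypothetical, not part of chatlab) trims only the non-system messages:

def keep_recent(messages, n=100):
    '''Keep all system messages plus the n most recent other messages.'''
    system_messages = [m for m in messages if m['role'] == 'system']
    rest = [m for m in messages if m['role'] != 'system']
    return system_messages + rest[-n:]

chat.messages = keep_recent(chat.messages, n=100)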

Messaging

human/user

These functions create a message from the user to the chat model.

from chatlab import human

human("How are you?")
{ "role": "user", "content": "How are you?" }

narrate/system

system messages, also called narrate in chatlab, let you steer the model in a particular direction. Use them to provide context or instructions that aren't presented as coming from the user. One common use is to include a system message as the initial context for a conversation.

from chatlab import narrate

narrate("You are a large bird")
{ "role": "system", "content": "You are a large bird" }

Development

This project uses poetry for dependency management. To get started, clone the repo and run

poetry install -E dev -E test

We use ruff and mypy.
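
Typical local checks look something like the following (the exact targets are an assumption; see the repository configuration for the canonical commands):

poetry run ruff check .
poetry run mypy chatlab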

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
