
Custom Jupyter magics for interacting with LLMs.

Project description

JupyterChatbook

Python package of a Jupyter extension that facilitates interaction with Large Language Models (LLMs).

Remark: The chatbook LLM cells use the packages "openai", [OAIp1], and "google-generativeai", [GAIp1].

Remark: The API keys for the LLM cells are taken from the Operating System (OS) environment variables OPENAI_API_KEY and PALM_API_KEY.
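
For example, one can check from a notebook cell that those keys are visible to the notebook process; a minimal sketch using only the standard library:

import os

# The LLM cells read these OS environment variables;
# warn if they are missing so failures are not a surprise later.
for var in ("OPENAI_API_KEY", "PALM_API_KEY"):
    if var not in os.environ:
        print(f"{var} is not set; the corresponding LLM cells will not work.")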

Remark: The results of the LLM cells are automatically copied to the clipboard using the package "pyperclip", [ASp1].
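
This means the most recent LLM result can be read back programmatically in a later cell; a minimal sketch using "pyperclip" directly:

import pyperclip

# The chatbook cells copy their results to the clipboard,
# so the latest LLM result can be retrieved like this:
last_result = pyperclip.paste()
print(last_result)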


Installation

Install from GitHub

pip install -e git+https://github.com/antononcube/Python-JupyterChatbook.git#egg=Python-JupyterChatbook

From PyPI

pip install JupyterChatbook
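
After installation the extension has to be loaded in a notebook session before the chat magics are available; a minimal sketch, assuming the extension module is named after the package:

# In a notebook cell; the module name is an assumption based on the package name.
%load_ext JupyterChatbook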

Demonstration notebooks (chatbooks)

| Notebook                   | Description                                     |
|----------------------------|-------------------------------------------------|
| Chatbook-cells-demo.ipynb  | How to do multi-cell (notebook-wide) chats?     |
| Chatbook-LLM-cells.ipynb   | How to "directly message" LLM services?         |
| DALL-E-cells-demo.ipynb    | How to generate images with DALL-E?             |
| Echoed-chats.ipynb         | How to see the LLM interaction execution steps? |

Notebook-wide chats

Chatbooks can maintain LLM conversations over multiple notebook cells, and a single chatbook can host more than one LLM conversation. "Under the hood" each chatbook maintains a database of chat objects, and chat cells are used to send messages to those chat objects.

For example, here is a chat cell that creates a new "Email writer" chat object with the identifier "em12":

%%chat --chat_id em12, --prompt "Given a topic, write emails in a concise, professional manner"
Write a vacation email.

Here is a chat cell in which another message is given to the chat object with identifier "em12":

%%chat --chat_id em12
Rewrite with manager's name being Jane Doe, and start- and end dates being 8/20 and 9/5.

In this chat cell a new chat object is created:

%%chat -i snowman, --prompt "Pretend you are a friendly snowman. Stay in character for every response you give me. Keep your responses short."
Hi!

And here is a chat cell that sends another message to the "snowman" chat object:

%%chat -i snowman
Who built you? Where?

Remark: Specifying a chat object identifier is not required; i.e., the bare magic spec %%chat can be used. The default chat object identifier is "NONE".
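
For instance, here is a sketch of a chat cell that sends its message to that default chat object, without specifying an identifier:

%%chat
What is the distance from the Earth to the Moon?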

For more examples see the notebook "Chatbook-cells-demo.ipynb".

Here is a flowchart that summarizes the way chatbooks create and utilize LLM chat objects:

flowchart LR
    OpenAI{{OpenAI}}
    PaLM{{PaLM}}
    LLMFunc[[LLMFunctions]]
    LLMProm[[LLMPrompts]]
    CODB[(Chat objects)]
    PDB[(Prompts)]
    CCell[/Chat cell/]
    CRCell[/Chat result cell/]
    CIDQ{Chat ID<br/>specified?}
    CIDEQ{Chat ID<br/>exists in DB?}
    RECO[Retrieve existing<br/>chat object]
    COEval[Message<br/>evaluation]
    PromParse[Prompt<br/>DSL spec parsing]
    KPFQ{Known<br/>prompts<br/>found?}
    PromExp[Prompt<br/>expansion]
    CNCO[Create new<br/>chat object]
    CIDNone["Assume chat ID<br/>is 'NONE'"] 
    subgraph Chatbook frontend    
        CCell
        CRCell
    end
    subgraph Chatbook backend
        CIDQ
        CIDEQ
        CIDNone
        RECO
        CNCO
        CODB
    end
    subgraph Prompt processing
        PDB
        LLMProm
        PromParse
        KPFQ
        PromExp 
    end
    subgraph LLM interaction
      COEval
      LLMFunc
      PaLM
      OpenAI
    end
    CCell --> CIDQ
    CIDQ --> |yes| CIDEQ
    CIDEQ --> |yes| RECO
    RECO --> PromParse
    COEval --> CRCell
    CIDEQ -.- CODB
    CIDEQ --> |no| CNCO
    LLMFunc -.- CNCO -.- CODB
    CNCO --> PromParse --> KPFQ
    KPFQ --> |yes| PromExp
    KPFQ --> |no| COEval
    PromParse -.- LLMProm 
    PromExp -.- LLMProm
    PromExp --> COEval 
    LLMProm -.- PDB
    CIDQ --> |no| CIDNone
    CIDNone --> CIDEQ
    COEval -.- LLMFunc
    LLMFunc <-.-> OpenAI
    LLMFunc <-.-> PaLM

Chat meta cells

TBD...


DALL-E access

See the notebook "DALL-E-cells-demo.ipynb".
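
As an illustration, here is a sketch of a DALL-E cell, assuming (by analogy with the chat cells) that the cell body is the image prompt:

%%dalle
A raccoon astronaut, in the style of a vintage travel poster.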



Implementation details

The design of this package -- and the workflows envisioned for it -- follow those of the Raku package "Jupyter::Chatbook", [AAp3].


TODO

  • TODO Implementation
    • DONE PaLM chat cell
    • TODO Using "pyperclip"
      • DONE Basic
        • %%chatgpt
        • %%dalle
        • %%palm
        • %%chat
      • TODO Switching on/off copying to the clipboard
        • DONE Per cell
          • With the argument --copy_to_clipboard.
        • TODO Global
          • Can be done via the chat meta cell, but maybe a more elegant, bureaucratic solution exists.
    • DONE DALL-E image variations cell
      • Combined image variations and edits with %%dalle.
    • TODO Mermaid-JS cell
    • TODO ProdGDT cell
    • MAYBE DeepL cell
  • TODO Documentation
    • TODO Multi-cell LLM chats movie (teaser)
    • TODO Multi-cell LLM chats movie (comprehensive)
    • TODO LLM service cells movie (short)
    • TODO Code generation

References

Packages

[AAp1] Anton Antonov, LLMFunctionObjects Python package, (2023), Python-packages at GitHub/antononcube.

[AAp2] Anton Antonov, LLMPrompts Python package, (2023), Python-packages at GitHub/antononcube.

[AAp3] Anton Antonov, Jupyter::Chatbook Raku package, (2023), GitHub/antononcube.

[ASp1] Al Sweigart, pyperclip (Python package), (2013-2021), PyPI.org/AlSweigart.

[GAIp1] Google AI, google-generativeai (Google Generative AI Python Client), (2023), PyPI.org/google-ai.

[OAIp1] OpenAI, openai (OpenAI Python Library), (2020-2023), PyPI.org.

Videos

[AAv1] Anton Antonov, "Jupyter Chatbook multi cell LLM chats teaser (Raku)", (2023), YouTube/@AAA4Prediction.

[AAv2] Anton Antonov, "Jupyter Chatbook multi cell LLM chats teaser (Python)", (2023), YouTube/@AAA4Prediction.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

JupyterChatbook-0.0.9.tar.gz (9.1 kB)


Built Distribution

JupyterChatbook-0.0.9-py3-none-any.whl (9.4 kB)


File details

Details for the file JupyterChatbook-0.0.9.tar.gz.

File metadata

  • Download URL: JupyterChatbook-0.0.9.tar.gz
  • Size: 9.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.8

File hashes

Hashes for JupyterChatbook-0.0.9.tar.gz

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | 4616e1985743cce60f42f60785acda74227e2c18a13ac19e360382d518ce018d |
| MD5         | 5b0ee4c8b7356d710e8a081e371f653c                                 |
| BLAKE2b-256 | 298522d2d6ee1a7f0694a1683c80731fce0280290158546e27d96b743820310f |

See more details on using hashes here.
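
For example, a downloaded archive can be checked against the SHA256 digest above with a short Python snippet (the local file path is assumed to be in the current directory):

import hashlib

# Path to the locally downloaded source distribution (assumed)
path = "JupyterChatbook-0.0.9.tar.gz"

with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# True if the file matches the SHA256 digest listed above
print(digest == "4616e1985743cce60f42f60785acda74227e2c18a13ac19e360382d518ce018d")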

File details

Details for the file JupyterChatbook-0.0.9-py3-none-any.whl.


File hashes

Hashes for JupyterChatbook-0.0.9-py3-none-any.whl

| Algorithm   | Hash digest                                                      |
|-------------|------------------------------------------------------------------|
| SHA256      | b5afbb3ed866a2c15c59204aa50d70b9e56658c9bf20c4ff08a6bc271b74e6a2 |
| MD5         | 0949448d1bc36444fb60917b52128d2d                                 |
| BLAKE2b-256 | 45ef5d16d78ab4bfd9e4f622a1906b48a27c223ff4251c0bbd3c0c251c1f7a80 |

See more details on using hashes here.
