
Custom Jupyter magics for interacting with LLMs.


JupyterChatbook

Python package of a Jupyter extension that facilitates interaction with Large Language Models (LLMs).

Remark: The chatbook LLM cells use the packages "openai", [OAIp1], and "google-generativeai", [GAIp1].

Remark: The API keys for the LLM cells are taken from the Operating System (OS) environment variables OPENAI_API_KEY and PALM_API_KEY.

Remark: The results of the LLM cells are automatically copied to the clipboard using the package "pyperclip", [ASp1].
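
Here is a minimal sketch (not from the package documentation) of how the two remarks above fit together in practice. It assumes the keys only need to be present in the process environment when the LLM cells are evaluated, and uses pyperclip.paste() to read the last copied result back into Python:

import os
import pyperclip

# Assumption: the LLM cells read the keys from the OS environment, so exporting
# them in the shell before launching Jupyter, or setting them early in the
# notebook session, should both work.
os.environ.setdefault("OPENAI_API_KEY", "<your OpenAI key>")
os.environ.setdefault("PALM_API_KEY", "<your PaLM key>")

# After an LLM cell has been evaluated its result is on the clipboard
# and can be retrieved programmatically:
last_llm_result = pyperclip.paste()
print(last_llm_result)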


Installation

Install from GitHub

pip install -e git+https://github.com/antononcube/Python-JupyterChatbook.git#egg=Python-JupyterChatbook

From PyPi

pip install JupyterChatbook
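
After installation the extension has to be loaded in a notebook session before the chat magics become available. Assuming the package registers itself as an IPython extension under its module name, a typical loading cell is:

%load_ext JupyterChatbook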

Demonstration notebooks (chatbooks)

Notebook                     Description
Chatbook-cells-demo.ipynb    How to do multi-cell (notebook-wide) chats?
Chatbook-LLM-cells.ipynb     How to "directly message" LLM services?
DALL-E-cells-demo.ipynb      How to generate images with DALL-E?
Echoed-chats.ipynb           How to see the LLM interaction execution steps?

Notebook-wide chats

Chatbooks have the ability to maintain LLM conversations over multiple notebook cells. A chatbook can have more than one LLM conversation. "Under the hood" each chatbook maintains a database of chat objects. Chat cells are used to give messages to those chat objects.

For example, here is a chat cell with which a new "Email writer" chat object is made, and that new chat object has the identifier "em12":

%%chat --chat_id em12, --prompt "Given a topic, write emails in a concise, professional manner"
Write a vacation email.

Here is a chat cell in which another message is given to the chat object with identifier "em12":

%%chat --chat_id em12
Rewrite with the manager's name being Jane Doe, and the start and end dates being 8/20 and 9/5.

In this chat cell a new chat object is created:

%%chat -i snowman, --prompt "Pretend you are a friendly snowman. Stay in character for every response you give me. Keep your responses short."
Hi!

And here is a chat cell that sends another message to the "snowman" chat object:

%%chat -i snowman
Who built you? Where?

Remark: Specifying a chat object identifier is not required; i.e., the bare magic spec %%chat can be used. The default chat object identifier is "NONE".
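
For example, here is an illustrative cell (not taken from the demo notebooks) that sends a message to the default chat object:

%%chat
Suggest three subject lines for a short vacation email.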

For more examples see the notebook "Chatbook-cells-demo.ipynb".

Here is a flowchart that summarizes the way chatbooks create and utilize LLM chat objects:

flowchart LR
    OpenAI{{OpenAI}}
    PaLM{{PaLM}}
    LLMFunc[[LLMFunctions]]
    LLMProm[[LLMPrompts]]
    CODB[(Chat objects)]
    PDB[(Prompts)]
    CCell[/Chat cell/]
    CRCell[/Chat result cell/]
    CIDQ{Chat ID<br/>specified?}
    CIDEQ{Chat ID<br/>exists in DB?}
    RECO[Retrieve existing<br/>chat object]
    COEval[Message<br/>evaluation]
    PromParse[Prompt<br/>DSL spec parsing]
    KPFQ{Known<br/>prompts<br/>found?}
    PromExp[Prompt<br/>expansion]
    CNCO[Create new<br/>chat object]
    CIDNone["Assume chat ID<br/>is 'NONE'"] 
    subgraph Chatbook frontend    
        CCell
        CRCell
    end
    subgraph Chatbook backend
        CIDQ
        CIDEQ
        CIDNone
        RECO
        CNCO
        CODB
    end
    subgraph Prompt processing
        PDB
        LLMProm
        PromParse
        KPFQ
        PromExp 
    end
    subgraph LLM interaction
      COEval
      LLMFunc
      PaLM
      OpenAI
    end
    CCell --> CIDQ
    CIDQ --> |yes| CIDEQ
    CIDEQ --> |yes| RECO
    RECO --> PromParse
    COEval --> CRCell
    CIDEQ -.- CODB
    CIDEQ --> |no| CNCO
    LLMFunc -.- CNCO -.- CODB
    CNCO --> PromParse --> KPFQ
    KPFQ --> |yes| PromExp
    KPFQ --> |no| COEval
    PromParse -.- LLMProm 
    PromExp -.- LLMProm
    PromExp --> COEval 
    LLMProm -.- PDB
    CIDQ --> |no| CIDNone
    CIDNone --> CIDEQ
    COEval -.- LLMFunc
    LLMFunc <-.-> OpenAI
    LLMFunc <-.-> PaLM

Chat meta cells

TBD...


DALL-E access

See the notebook "DALL-E-cells-demo.ipynb".
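
A DALL-E cell is a %%dalle magic cell whose body holds the image prompt. Here is a hypothetical minimal example; the exact cell arguments (image size, number of images, variations versus edits) are shown in the demo notebook:

%%dalle
Racoon with a sliced watermelon, in the style of an oil painting.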



Implementation details

The design of this package -- and corresponding envisioned workflows with it -- follow those of the Raku package "Jupyter::Chatbook", [AAp3].


TODO

  • TODO Implementation
    • DONE PaLM chat cell
    • TODO Using "pyperclip"
      • DONE Basic
        • %%chatgpt
        • %%dalle
        • %%palm
        • %%chat
      • TODO Switching on/off copying to the clipboard
        • DONE Per cell
          • With the argument --copy_to_clipboard.
        • TODO Global
          • Can be done via the chat meta cell, but maybe a more elegant, bureaucratic solution exists.
    • DONE DALL-E image variations cell
      • Combined image variations and edits with %%dalle.
    • TODO Mermaid-JS cell
    • TODO ProdGDT cell
    • MAYBE DeepL cell
  • TODO Documentation
    • TODO Multi-cell LLM chats movie (teaser)
    • TODO Multi-cell LLM chats movie (comprehensive)
    • TODO LLM service cells movie (short)
    • TODO Code generation

References

Packages

[AAp1] Anton Antonov, LLMFunctionObjects Python package, (2023), Python-packages at GitHub/antononcube.

[AAp2] Anton Antonov, LLMPrompts Python package, (2023), Python-packages at GitHub/antononcube.

[AAp3] Anton Antonov, Jupyter::Chatbook Raku package, (2023), GitHub/antononcube.

[ASp1] Al Sweigart, pyperclip (Python package), (2013-2021), PyPI.org/AlSweigart.

[GAIp1] Google AI, google-generativeai (Google Generative AI Python Client), (2023), PyPI.org/google-ai.

[OAIp1] OpenAI, openai (OpenAI Python Library), (2020-2023), PyPI.org.

Videos

[AAv1] Anton Antonov, "Jupyter Chatbook multi cell LLM chats teaser (Raku)", (2023), YouTube/@AAA4Prediction.

[AAv2] Anton Antonov, "Jupyter Chatbook multi cell LLM chats teaser (Python)", (2023), YouTube/@AAA4Prediction.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

JupyterChatbook-0.0.8.tar.gz (9.0 kB)

Uploaded Source

Built Distribution

JupyterChatbook-0.0.8-py3-none-any.whl (9.3 kB)

Uploaded Python 3

File details

Details for the file JupyterChatbook-0.0.8.tar.gz.

File metadata

  • Download URL: JupyterChatbook-0.0.8.tar.gz
  • Upload date:
  • Size: 9.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.8

File hashes

Hashes for JupyterChatbook-0.0.8.tar.gz
Algorithm Hash digest
SHA256 19e1cdefb5c2b14b2f96d5dbadd7812df6afb9d2319b92602934cd2304d8fad4
MD5 2d37abcc31be71621552f7bd80653f21
BLAKE2b-256 61e922b09bb7f47293f4e93ef3a49edae9d143fc58014625bd332c78f28b94cd

See more details on using hashes here.

File details

Details for the file JupyterChatbook-0.0.8-py3-none-any.whl.

File metadata

File hashes

Hashes for JupyterChatbook-0.0.8-py3-none-any.whl
Algorithm Hash digest
SHA256 47c9968adaf58cbf5d6f3a18f6071fba473e923a8dbdba386542a479c4b0930d
MD5 adde11af865b76cc479c95b9120b64b0
BLAKE2b-256 6aaf943b6ad0902c098c139334a7ca34bfcff1ea8c9df59707bceefc7ea14aa4

See more details on using hashes here.
