Custom Jupyter magics for interacting with LLMs.
JupyterChatbook
Python package of a Jupyter extension that facilitates interaction with Large Language Models (LLMs).
Remark: The chatbook LLM cells use the packages "openai", [OAIp1], and "google-generativeai", [GAIp1].
Remark: The API keys for the LLM cells are taken from the Operating System (OS) environment variables OPENAI_API_KEY and PALM_API_KEY.
Remark: The results of the LLM cells are automatically copied to the clipboard using the package "pyperclip", [ASp1].
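For example, in a Unix-like shell the keys can be set before starting Jupyter (the key values below are only placeholders):

export OPENAI_API_KEY="<your OpenAI key>"
export PALM_API_KEY="<your PaLM key>"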
Installation
Install from GitHub
pip install -e git+https://github.com/antononcube/Python-JupyterChatbook.git#egg=Python-JupyterChatbook
From PyPI
pip install JupyterChatbook
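After installation the chatbook magics have to be loaded into the notebook session. Assuming the extension is registered under the package name (the usual IPython extension convention; see the demonstration notebooks below for the exact invocation), a load cell looks like this:

%load_ext JupyterChatbook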
Demonstration notebooks (chatbooks)
| Notebook | Description |
|---|---|
| Chatbooks-cells-demo.ipynb | How to do multi-cell (notebook-wide) chats? |
| Chatbook-LLM-cells.ipynb | How to "directly message" LLM services? |
| DALL-E-cells-demo.ipynb | How to generate images with DALL-E? |
| Echoed-chats.ipynb | How to see the LLM interaction execution steps? |
Notebook-wide chats
Chatbooks have the ability to maintain LLM conversations over multiple notebook cells. A chatbook can have more than one LLM conversation. "Under the hood" each chatbook maintains a database of chat objects. Chat cells are used to give messages to those chat objects.
For example, here is a chat cell with which a new "Email writer" chat object is made, and that new chat object has the identifier "em12":
%%chat --chat_id em12, --prompt "Given a topic, write emails in a concise, professional manner"
Write a vacation email.
Here is a chat cell in which another message is given to the chat object with identifier "em12":
%%chat --chat_id em12
Rewrite with manager's name being Jane Doe, and start- and end dates being 8/20 and 9/5.
In this chat cell a new chat object is created:
%%chat -i snowman, --prompt "Pretend you are a friendly snowman. Stay in character for every response you give me. Keep your responses short."
Hi!
And here is a chat cell that sends another message to the "snowman" chat object:
%%chat -i snowman
Who built you? Where?
Remark: Specifying a chat object identifier is not required, i.e. the magic spec %%chat can be used by itself. The default chat object identifier is "NONE".
For more examples see the notebook "Chatbook-cells-demo.ipynb".
Here is a flowchart that summarizes the way chatbooks create and utilize LLM chat objects:
flowchart LR
OpenAI{{OpenAI}}
PaLM{{PaLM}}
LLMFunc[[LLMFunctions]]
LLMProm[[LLMPrompts]]
CODB[(Chat objects)]
PDB[(Prompts)]
CCell[/Chat cell/]
CRCell[/Chat result cell/]
CIDQ{Chat ID<br/>specified?}
CIDEQ{Chat ID<br/>exists in DB?}
RECO[Retrieve existing<br/>chat object]
COEval[Message<br/>evaluation]
PromParse[Prompt<br/>DSL spec parsing]
KPFQ{Known<br/>prompts<br/>found?}
PromExp[Prompt<br/>expansion]
CNCO[Create new<br/>chat object]
CIDNone["Assume chat ID<br/>is 'NONE'"]
subgraph Chatbook frontend
CCell
CRCell
end
subgraph Chatbook backend
CIDQ
CIDEQ
CIDNone
RECO
CNCO
CODB
end
subgraph Prompt processing
PDB
LLMProm
PromParse
KPFQ
PromExp
end
subgraph LLM interaction
COEval
LLMFunc
PaLM
OpenAI
end
CCell --> CIDQ
CIDQ --> |yes| CIDEQ
CIDEQ --> |yes| RECO
RECO --> PromParse
COEval --> CRCell
CIDEQ -.- CODB
CIDEQ --> |no| CNCO
LLMFunc -.- CNCO -.- CODB
CNCO --> PromParse --> KPFQ
KPFQ --> |yes| PromExp
KPFQ --> |no| COEval
PromParse -.- LLMProm
PromExp -.- LLMProm
PromExp --> COEval
LLMProm -.- PDB
CIDQ --> |no| CIDNone
CIDNone --> CIDEQ
COEval -.- LLMFunc
LLMFunc <-.-> OpenAI
LLMFunc <-.-> PaLM
Chat meta cells
TBD...
DALL-E access
See the notebook "DALL-E-cells-demo.ipynb"
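Here is a minimal sketch of a DALL-E cell (the %%dalle magic is listed in the TODO section below; the prompt text is only an illustration and generation options are omitted):

%%dalle
A watercolor painting of a friendly snowman writing an email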
Implementation details
The design of this package -- and corresponding envisioned workflows with it -- follow those of the Raku package "Jupyter::Chatbook", [AAp3].
TODO
- TODO Implementation
  - DONE PaLM chat cell
  - TODO Using "pyperclip"
    - DONE Basic
      - %%chatgpt
      - %%dalle
      - %%palm
      - %%chat
    - TODO Switching on/off copying to the clipboard
      - DONE Per cell
        - With the argument --copy_to_clipboard.
      - TODO Global
        - Can be done via the chat meta cell, but maybe a more elegant, bureaucratic solution exists.
  - DONE Formatted output: asis, html, markdown
    - General lexer code?
      - Includes LaTeX.
    - %%chatgpt
    - %%palm
    - %%chat
    - %%chat_meta ?
  - DONE DALL-E image variations cell
    - Combined image variations and edits with %%dalle.
  - TODO Mermaid-JS cell
  - TODO ProdGDT cell
  - MAYBE DeepL cell
    - See "deepl-python".
- TODO Documentation
  - DONE Multi-cell LLM chats movie (teaser)
    - See [AAv2].
  - TODO LLM service cells movie (short)
  - TODO Multi-cell LLM chats movie (comprehensive)
  - TODO Code generation
References
Packages
[AAp1] Anton Antonov, LLMFunctionObjects Python package, (2023), Python-packages at GitHub/antononcube.
[AAp2] Anton Antonov, LLMPrompts Python package, (2023), Python-packages at GitHub/antononcube.
[AAp3] Anton Antonov, Jupyter::Chatbook Raku package, (2023), GitHub/antononcube.
[ASp1] Al Sweigart, pyperclip (Python package), (2013-2021), PyPI.org/AlSweigart.
[GAIp1] Google AI, google-generativeai (Google Generative AI Python Client), (2023), PyPI.org/google-ai.
[OAIp1] OpenAI, openai (OpenAI Python Library), (2020-2023), PyPI.org.
Videos
[AAv1] Anton Antonov, "Jupyter Chatbook multi cell LLM chats teaser (Raku)", (2023), YouTube/@AAA4Prediction.
[AAv2] Anton Antonov, "Jupyter Chatbook multi cell LLM chats teaser (Python)", (2023), YouTube/@AAA4Prediction.