LLM Magics for IPython, created by kmkolasinski
LLM Magics for Jupyter Notebooks
Interact seamlessly with Large Language Models (LLMs) like OpenAI's GPT models directly from your Jupyter Notebook using IPython magics.
https://pypi.org/project/llm-magics/
pip install llm-magics
Features
- Interactive Chat Interface: Engage in a conversational exchange with LLMs within your notebook cells.
- Customizable Models: Switch between different OpenAI chat models (e.g., gpt-3.5-turbo, gpt-4).
- Set System Messages: Define system prompts to guide the behavior of the LLM.
- Persistent Chat History: Maintain context across multiple interactions.
- Rich Rendering: Receive responses with proper formatting, including syntax-highlighted code blocks.
- Easy Clearing: Reset the conversation when needed.
Installation
Prerequisites
- Python 3.10 or higher
- Jupyter Notebook or JupyterLab
- An OpenAI API key
Steps
- Clone the repository:
  git clone https://github.com/kmkolasinski/ipython-llm-magics
  cd ipython-llm-magics
- Install the required Python packages:
  pip install -r requirements.txt
- Install the package:
  pip install .
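As a quick sanity check that the installation succeeded, the extension module should import cleanly (a minimal sketch; the module name llm_magics matches the %load_ext call shown in the Usage section below):

# Post-install check: the importable module is llm_magics,
# the same name passed to %load_ext later in this guide.
import llm_magics
print("llm_magics imported from:", llm_magics.__file__)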
Configuration
Setting the OpenAI API Key
Before using the magics, ensure your OpenAI API key is accessible to the application.
Option 1: Environment Variable
Set the OPENAI_API_KEY environment variable in your shell:
export OPENAI_API_KEY='your-api-key-here'
Option 2: Within the Notebook
Alternatively, set the API key within your notebook:
import os
os.environ['OPENAI_API_KEY'] = 'your-api-key-here'
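If you prefer not to hard-code the key in a saved notebook, one option is to enter it interactively with the standard-library getpass module (a small sketch, independent of llm-magics itself):

import os
from getpass import getpass

# Prompt for the key at runtime so it never appears in the notebook file
os.environ['OPENAI_API_KEY'] = getpass('OpenAI API key: ')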
Usage
Loading the Extension
In your Jupyter Notebook, load the llm_magics extension:
%load_ext llm_magics
Setting the Model
Specify the OpenAI model you want to use:
%llm_set_model gpt-4o
Available models include gpt-3.5-turbo, gpt-4o, and others.
Setting a System Message (Optional)
Define a system message to guide the assistant's behavior:
%llm_set_system_message "You are a helpful assistant."
Starting a Conversation
Use the %%llm_chat cell magic to send a message to the LLM:
%%llm_chat
Write a Python function that generates a random integer between 1 and 100.
Response:
The assistant will provide a Python function as per your request.
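For illustration only, the returned code might resemble the sketch below; the exact function name, wording, and explanation vary from run to run:

import random

def random_integer(low: int = 1, high: int = 100) -> int:
    """Return a random integer between low and high, inclusive."""
    return random.randint(low, high)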
Continuing the Conversation
Maintain context across multiple %%llm_chat cells:
%%llm_chat
Now modify the function to generate a random floating-point number between 0 and 1.
Response:
The assistant will update the function accordingly.
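Again purely for illustration, the follow-up reply might rework the earlier function along these lines:

import random

def random_float() -> float:
    """Return a random floating-point number in the interval [0.0, 1.0)."""
    return random.random()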
Inserting Local Variables into the Chat
Local variables defined in the notebook can be referenced inside a prompt using the $variable syntax. First define the variables in a regular cell:
not_sorted_list = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
expected_output = sorted(not_sorted_list, reverse=True)
Then interpolate them into a %%llm_chat cell:
%%llm_chat
Write Python code to sort the list in descending order using the merge sort algorithm, so that I can write:

input = $not_sorted_list
assert $expected_output == my_merge_sort(input)
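For reference, a reply that satisfies the assertion above could resemble the following sketch (the name my_merge_sort comes from the prompt; the model's actual implementation will differ):

def my_merge_sort(items: list) -> list:
    """Sort a list in descending order using merge sort."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = my_merge_sort(items[:mid])
    right = my_merge_sort(items[mid:])

    # Merge the two sorted halves, taking the larger element first
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] >= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged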
Clearing the Conversation History
Reset the chat history when needed:
%llm_clear
Rendering and Syntax Highlighting
The extension includes rich rendering of the LLM's responses:
- Code blocks are syntax-highlighted using Prism.js and Highlight.js.
- Copy buttons are added to code blocks for convenience.
- Markdown formatting is preserved in the responses.
History
- v1.1.0: Initial release with basic chat functionality and rendering.