LLM Magics for IPython, created by kmkolasinski
LLM Magics for Jupyter Notebooks
Interact seamlessly with Large Language Models (LLMs) like OpenAI's GPT models directly from your Jupyter Notebook using IPython magics.
https://pypi.org/project/llm-magics/
pip install llm-magics
Features
- Interactive Chat Interface: Engage in a conversational exchange with LLMs within your notebook cells.
- Customizable Models: Switch between different OpenAI chat models (e.g., gpt-3.5-turbo, gpt-4).
- Set System Messages: Define system prompts to guide the behavior of the LLM.
- Persistent Chat History: Maintain context across multiple interactions.
- Rich Rendering: Receive responses with proper formatting, including syntax-highlighted code blocks.
- Easy Clearing: Reset the conversation when needed.
Installation
Prerequisites
- Python 3.10 or higher
- Jupyter Notebook or JupyterLab
- An OpenAI API key
Steps
- Clone the Repository
git clone https://github.com/kmkolasinski/ipython-llm-magics
cd ipython-llm-magics
- Install Dependencies
Install the required Python packages:
pip install -r requirements.txt
- Install the Package
pip install .
Configuration
Setting the OpenAI API Key
Before using the magics, ensure your OpenAI API key is accessible to the application.
Option 1: Environment Variable
Set the OPENAI_API_KEY environment variable in your shell:
export OPENAI_API_KEY='your-api-key-here'
Option 2: Within the Notebook
Alternatively, set the API key within your notebook:
import os
os.environ['OPENAI_API_KEY'] = 'your-api-key-here'
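If you prefer not to hard-code the key in a notebook you might share, a minimal alternative (standard-library Python only, not a feature of llm-magics) is to prompt for it interactively:
import getpass
import os
# Ask for the key at runtime so it is never stored in the notebook file
os.environ['OPENAI_API_KEY'] = getpass.getpass('OpenAI API key: ')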
Usage
Loading the Extension
In your Jupyter Notebook, load the llm_magics extension:
%load_ext llm_magics
Setting the Model
Specify the OpenAI model you want to use:
%llm_set_model gpt-4o
Available models include gpt-3.5-turbo, gpt-4o, etc.
Setting a System Message (Optional)
Define a system message to guide the assistant's behavior:
%llm_set_system_message "You are a helpful assistant."
Starting a Conversation
Use the %%llm_chat cell magic to send a message to the LLM:
%%llm_chat
Write a Python function that generates a random integer between 1 and 100.
Response:
The assistant will provide a Python function as per your request.
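For illustration only (actual model output will vary), the returned function could look something like:
import random

def random_integer(low=1, high=100):
    # randint is inclusive on both ends
    return random.randint(low, high)

print(random_integer())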
Continuing the Conversation
Maintain context across multiple %%llm_chat cells:
%%llm_chat
Now modify the function to generate a random floating-point number between 0 and 1.
Response:
The assistant will update the function accordingly.
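Again purely as an illustration of the kind of answer to expect, the updated function might be:
import random

def random_float():
    # random() returns a float in the half-open interval [0.0, 1.0)
    return random.random()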
Inserting Local Variables into the Chat
Local notebook variables can be referenced in a prompt with a $ prefix; their values are inserted into the message sent to the LLM:
not_sorted_list = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5]
expected_output = sorted(not_sorted_list)
%%llm_chat
Write Python code to sort the list in ascending order using the merge sort algorithm.
So that I can write:
input = $not_sorted_list
assert $expected_output == my_merge_sort(input)
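A reply that satisfies the assertion above might look roughly like the following sketch (illustrative, not guaranteed model output):
def my_merge_sort(values):
    # Recursively split the list, then merge the sorted halves
    if len(values) <= 1:
        return list(values)
    mid = len(values) // 2
    left = my_merge_sort(values[:mid])
    right = my_merge_sort(values[mid:])
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged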
Clearing the Conversation History
Reset the chat history when needed:
%llm_clear
Rendering and Syntax Highlighting
The extension includes rich rendering of the LLM's responses:
- Code blocks are syntax-highlighted using Prism.js and Highlight.js.
- Copy buttons are added to code blocks for convenience.
- Markdown formatting is preserved in the responses.
History
- v1.1.0: Initial release with basic chat functionality and rendering.
File details
Details for the file llm_magics-1.2.1.tar.gz.
File metadata
- Download URL: llm_magics-1.2.1.tar.gz
- Upload date:
- Size: 7.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | a26104b180a937f4c06e4fbf4df80434faea05b5365d82d1f7d413fafd6cdd38
MD5 | 0bbb152c7648e37f6322dccf71480a82
BLAKE2b-256 | 3928c6d6d31aeda9e9f0667f8b65f38cf96ef317ab77bfef1aa5a9887f12622f
File details
Details for the file llm_magics-1.2.1-py3-none-any.whl.
File metadata
- Download URL: llm_magics-1.2.1-py3-none-any.whl
- Upload date:
- Size: 7.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 07edde690d82422f17e321876eb644c95ae1c9b9148184d24e2e44fc4f8e9455
MD5 | c511b4b344ca1eb5341fb931e5630cf7
BLAKE2b-256 | ab1b69a4dfe59f5bb2b9426575d3902a696accb29bab68a62a96f77078728782