Minimal IPython backtick-to-AI extension

ipyai

ipyai is an IPython extension that turns any input starting with a backtick (`) into an AI prompt.

It is aimed at terminal IPython, not notebook frontends. Prompts stream through lisette, final output is rendered with rich, and prompt history is stored alongside normal IPython history in the same SQLite database.
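One way to picture the mechanism is as an IPython input transformer that rewrites a backtick cell into an ordinary call. A minimal sketch, assuming a hypothetical `run_ai_prompt` helper (ipyai's real hook is more involved):

```python
# Sketch of a backtick-prefix input transformer, in the style of
# IPython's input-transformer hooks. `run_ai_prompt` is a hypothetical
# helper, not ipyai's actual API.
def backtick_transformer(lines):
    """Rewrite a cell that starts with ` into an AI-handler call."""
    if lines and lines[0].startswith("`"):
        prompt = "".join(lines)[1:]  # drop the leading backtick
        return [f"run_ai_prompt({prompt!r})\n"]
    return lines  # ordinary Python cells pass through unchanged
```

Because only the leading backtick is special, everything after it, including later lines of a multiline paste, becomes part of the prompt.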

Install

pip install ipyai

Load

%load_ext ipyai

If you change the package in a running shell:

%reload_ext ipyai

Auto-Loading ipyai

To auto-load ipyai in terminal IPython, add this to an ipython_config.py file that terminal IPython reads:

c.TerminalIPythonApp.extensions = ["ipyai.core"]

Good places for that file include:

  • env-local: {sys.prefix}/etc/ipython/ipython_config.py
  • user-local: ~/.ipython/profile_default/ipython_config.py
  • system-wide IPython config directories

In a virtualenv, the env-local path is usually:

  • .venv/etc/ipython/ipython_config.py
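To print the env-local path for whichever interpreter you are running (inside or outside a venv), `sys.prefix` resolves it directly:

```python
# Print the env-local ipython_config.py path for the current interpreter.
import os
import sys

config_path = os.path.join(sys.prefix, "etc", "ipython", "ipython_config.py")
print(config_path)
```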

To see which config paths your current ipython is searching, run:

ipython --debug -c 'exit()' 2>&1 | grep Searching

Usage

Only the leading backtick is special. There is no closing delimiter.

Single line:

`write a haiku about sqlite

Multiline paste:

`summarize this module:
focus on state management
and persistence behavior

Backslash-Enter continuation in the terminal:

`draft a migration plan \
with risks and rollback steps

ipyai also provides a line magic (%ipyai) and a cell magic (%%ipyai).

%ipyai commands

%ipyai
%ipyai model claude-sonnet-4-6
%ipyai think m
%ipyai search h
%ipyai code_theme monokai
%ipyai save
%ipyai reset

Behavior:

  • %ipyai prints the active model, think level, search level, code theme, logging flag, config path, system prompt path, startup path, and exact-log path
  • %ipyai model ..., %ipyai think ..., %ipyai search ..., %ipyai code_theme ... change the current session only
  • %ipyai save writes the current session's code and AI prompts to startup.json
  • %ipyai reset deletes AI prompt history for the current IPython session and resets the code-context baseline

Tools

To expose a function from the active IPython namespace as a tool for the current conversation, reference it as &`name` in the prompt:

def weather(city): return f"Sunny in {city}"

`use &`weather` to answer the question about Brisbane
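Conceptually, ipyai only needs to find the &`name` references in the prompt and look those names up in the user namespace. A rough sketch, not the actual implementation:

```python
# Sketch: extract &`name` tool references from a prompt and resolve
# them against a namespace dict (IPython's user_ns in practice).
# Hypothetical helper, not ipyai's real code.
import re

TOOL_REF = re.compile(r"&`(\w+)`")

def resolve_tools(prompt, namespace):
    """Return the callables referenced as &`name` in the prompt."""
    return [namespace[m] for m in TOOL_REF.findall(prompt) if m in namespace]
```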

Output Rendering

Responses are streamed directly to the terminal during generation.

  • in a TTY, ipyai uses Rich Live(Markdown(...)) so the visible response is rendered as markdown while it streams
  • the stored response remains the original full lisette output
  • tool call detail blocks are compacted in the visible output to a short single-line form such as 🔧 f(x=1) => 2
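The compaction step can be sketched as a small formatting helper; the format below just mirrors the example above, and the real rendering logic in ipyai is more involved:

```python
# Hypothetical sketch: compact a tool call into the short single-line
# form shown above, e.g. 🔧 f(x=1) => 2. Not ipyai's actual code.
def compact_tool_call(name, kwargs, result):
    args = ", ".join(f"{k}={v!r}" for k, v in kwargs.items())
    return f"🔧 {name}({args}) => {result!r}"
```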

Startup Replay

On first load, ipyai creates ~/.config/ipyai/startup.json.

%ipyai save snapshots the current IPython session into that file:

  • normal code cells are saved as code events
  • AI prompts are saved as prompt/response events

When ipyai loads into a fresh terminal IPython session:

  • saved code events are replayed with run_cell(..., store_history=True)
  • saved prompt/response pairs are inserted into ai_prompts for the new session

This is intended for priming new sessions with imports, helper definitions, tools, and prior AI context without re-running the prompts themselves.
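A simplified view of the replay step, assuming startup.json holds a list of events with a `type` key (the real schema and replay code may differ):

```python
# Sketch of startup replay, assuming events shaped like
# {"type": "code", "source": "..."} and
# {"type": "prompt", "prompt": "...", "response": "..."}.
# The actual startup.json schema may differ.
def replay(events, run_cell, insert_prompt):
    for ev in events:
        if ev["type"] == "code":
            run_cell(ev["source"])  # re-executed in the new session
        elif ev["type"] == "prompt":
            # stored as prior AI context, not re-sent to the model
            insert_prompt(ev["prompt"], ev["response"])
```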

Configuration

On first load, ipyai creates these files under the XDG config directory:

  • ~/.config/ipyai/config.json
  • ~/.config/ipyai/sysp.txt
  • ~/.config/ipyai/startup.json

config.json currently supports:

{
  "model": "claude-sonnet-4-6",
  "think": "l",
  "search": "l",
  "code_theme": "monokai",
  "log_exact": false
}
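Loading such a file over built-in defaults can be sketched in a few lines; the defaults below mirror the sample, and ipyai's own loader may differ:

```python
# Sketch: merge config.json over defaults. Hypothetical helper,
# not ipyai's actual loader.
import json
from pathlib import Path

DEFAULTS = {"model": "claude-sonnet-4-6", "think": "l", "search": "l",
            "code_theme": "monokai", "log_exact": False}

def load_config(path):
    cfg = dict(DEFAULTS)
    p = Path(path)
    if p.exists():
        cfg.update(json.loads(p.read_text()))  # file keys win
    return cfg
```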

Notes:

  • model defaults from IPYAI_MODEL if that environment variable is set when the config file is first created
  • think and search must be one of l, m, or h
  • code_theme is passed to Rich for fenced and inline code styling
  • log_exact, when true, appends the exact full prompt sent to the model and the exact raw response returned by the model to ~/.config/ipyai/exact-log.jsonl
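The exact-log format described above is plain JSON Lines; appending an entry can be sketched as follows (the field names here are assumptions, not necessarily what ipyai writes):

```python
# Sketch: append one exact prompt/response record as a JSON line.
# The field names "prompt" and "response" are assumptions.
import json

def append_exact_log(path, prompt, response):
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps({"prompt": prompt, "response": response}) + "\n")
```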

sysp.txt is used as the system prompt passed to lisette.Chat.

Development

See DEV.md for project layout, architecture, persistence details, and development workflow.

