Minimal IPython backtick-to-AI extension

ipyai

ipyai is an IPython extension that turns any input line starting with a backtick (`) into an AI prompt.

It is aimed at terminal IPython, not notebook frontends. Prompts stream through lisette, final output is rendered with rich, and prompt history is stored alongside normal IPython history in the same SQLite database.

When imported, ipyai also applies two small IPython compatibility fixes borrowed from ipykernel_helper for traceback and inspect.getfile edge cases.

Install

pip install ipyai

Load

%load_ext ipyai

If you change the package in a running shell:

%reload_ext ipyai

How To Auto-Load ipyai

ipyai is designed for terminal IPython. To auto-load it, add this line to an ipython_config.py that your terminal IPython reads:

c.TerminalIPythonApp.extensions = ["ipyai.core"]

Good places for that file include:

  • env-local: {sys.prefix}/etc/ipython/ipython_config.py
  • user-local: ~/.ipython/profile_default/ipython_config.py
  • system-wide IPython config directories

In a virtualenv, the env-local path is usually:

  • .venv/etc/ipython/ipython_config.py
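As a sketch, the env-local file can be created like this (the $VIRTUAL_ENV fallback path is illustrative; adjust to your environment's prefix):

```shell
# Append the auto-load setting to the env-local IPython config.
PREFIX="${VIRTUAL_ENV:-$HOME/.venv}"
mkdir -p "$PREFIX/etc/ipython"
printf '%s\n' 'c.TerminalIPythonApp.extensions = ["ipyai.core"]' \
  >> "$PREFIX/etc/ipython/ipython_config.py"
```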

To see which config paths your current ipython is searching, run:

ipython --debug -c 'exit()' 2>&1 | grep Searching

Usage

Only the leading backtick is special. There is no closing delimiter.

Single line:

`write a haiku about sqlite

Multiline paste:

`summarize this module:
focus on state management
and persistence behavior

Backslash-Enter continuation in the terminal:

`draft a migration plan \
with risks and rollback steps

ipyai also provides line and cell magics, %ipyai and %%ipyai.

%ipyai commands

%ipyai
%ipyai model claude-sonnet-4-6
%ipyai think m
%ipyai search h
%ipyai code_theme monokai
%ipyai save
%ipyai reset

Behavior:

  • %ipyai prints the active model, think level, search level, code theme, logging flag, and the current config file paths
  • %ipyai model ..., %ipyai think ..., %ipyai search ..., %ipyai code_theme ... change the current session only
  • %ipyai save writes the current session's code and AI prompts to startup.json
  • %ipyai reset deletes AI prompt history for the current IPython session and resets the code-context baseline

Tools

To expose a function from the active IPython namespace as a tool for the current conversation, reference it as &`name` in the prompt:

def weather(city): return f"Sunny in {city}"

`use &`weather` to answer the question about Brisbane

The tool name exposed to the model is the namespace name you referenced, so callable objects bound in user_ns also work as expected. Async callables are also supported.
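Since a tool is just a callable in the namespace, an async tool is defined the same way. A minimal sketch of what invoking one comes down to (this is an illustration, not ipyai's internals; the weather name follows the example above):

```python
import asyncio

# An async callable in the IPython namespace; ipyai exposes it
# to the model under its namespace name, here "weather".
async def weather(city: str) -> str:
    return f"Sunny in {city}"

# Resolving a tool call roughly amounts to awaiting the coroutine
# with the arguments the model supplied.
result = asyncio.run(weather("Brisbane"))
print(result)  # Sunny in Brisbane
```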

Output Rendering

Responses are streamed directly to the terminal during generation.

  • in a TTY, ipyai uses Rich Live(Markdown(...)) so the visible response is rendered as markdown while it streams
  • the stored response remains the original full lisette output
  • tool call detail blocks are compacted in the visible output to a short single-line form such as 🔧 f(x=1) => 2
  • streamed AI responses are intentionally suppressed from IPython's normal output_history; ipyai stores them in ai_prompts instead
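The TTY rendering path can be sketched with Rich directly (the chunks below are illustrative; ipyai's actual stream comes from lisette):

```python
from rich.console import Console
from rich.live import Live
from rich.markdown import Markdown

console = Console()
buf = ""
# Simulated stream chunks; a real stream would arrive from the model.
with Live(console=console, refresh_per_second=8) as live:
    for chunk in ("# Stream", "ed **resp", "onse**"):
        buf += chunk
        live.update(Markdown(buf))  # re-render the accumulated markdown
```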

Startup Replay

On first load, ipyai creates ~/.config/ipyai/startup.json.

%ipyai save snapshots the current IPython session into that file:

  • normal code cells are saved as code events
  • AI prompts are saved as prompt/response events

When ipyai loads into a fresh terminal IPython session:

  • saved code events are replayed with run_cell(..., store_history=True)
  • saved prompt/response pairs are inserted into ai_prompts for the new session

This is intended for priming new sessions with imports, helper definitions, tools, and prior AI context without re-running the prompts themselves.
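Replay can be sketched as a loop over saved events (the event shapes and helper names here are illustrative assumptions, not ipyai's actual schema):

```python
# Hypothetical event shapes, for illustration only.
events = [
    {"type": "code", "source": "import math"},
    {"type": "prompt", "prompt": "explain math.tau", "response": "tau = 2*pi ..."},
]

def replay(events, run_cell, store_prompt):
    for ev in events:
        if ev["type"] == "code":
            # code events go back into history, as run_cell(..., store_history=True) does
            run_cell(ev["source"], store_history=True)
        else:
            # prompt/response pairs are recorded without re-running the prompt
            store_prompt(ev["prompt"], ev["response"])
```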

Configuration

At import time, ipyai resolves these XDG-backed file paths (exposed as module-level variables):

  • ~/.config/ipyai/config.json
  • ~/.config/ipyai/sysp.txt
  • ~/.config/ipyai/startup.json
  • ~/.config/ipyai/exact-log.jsonl

Those files are created on demand when ipyai first needs them.

config.json currently supports:

{
  "model": "claude-sonnet-4-6",
  "think": "l",
  "search": "l",
  "code_theme": "monokai",
  "log_exact": false
}

Notes:

  • model defaults from IPYAI_MODEL if that environment variable is set when the config file is first created
  • think and search must be one of l, m, or h
  • code_theme is passed to Rich for fenced and inline code styling
  • log_exact, when true, appends the exact full prompt sent to the model and the exact raw response returned by the model to ~/.config/ipyai/exact-log.jsonl
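A minimal sketch of loading that config (the defaults mirror the config.json example above; the loader itself is an illustration, not ipyai's code):

```python
import json
from pathlib import Path

# Defaults mirror the config.json example above.
DEFAULTS = {"model": "claude-sonnet-4-6", "think": "l", "search": "l",
            "code_theme": "monokai", "log_exact": False}

def load_config(path):
    # A missing file or missing keys fall back to the defaults.
    p = Path(path)
    cfg = DEFAULTS | (json.loads(p.read_text()) if p.exists() else {})
    for k in ("think", "search"):
        if cfg[k] not in ("l", "m", "h"):  # levels must be l, m, or h
            raise ValueError(f"{k} must be one of l, m, h")
    return cfg
```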

sysp.txt is used as the system prompt passed to lisette.AsyncChat.

Development

See DEV.md for project layout, architecture, persistence details, and development workflow.
