Minimal IPython backtick-to-AI extension

ipyai

ipyai is an IPython extension that turns any input starting with ` into an AI prompt.

It is aimed at terminal IPython, not notebook frontends. Prompts stream through lisette, final output is rendered with rich, and prompt history is stored alongside normal IPython history in the same SQLite database.
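
As a sketch of what "prompt history alongside normal IPython history" could look like, the snippet below stores prompts in an ai_prompts table (a table name this README mentions under Startup Replay) in a SQLite database. The schema is an illustrative assumption, not ipyai's actual layout.

```python
import sqlite3

# Illustrative only: IPython's real history lives in
# ~/.ipython/profile_default/history.sqlite; we use an in-memory DB here.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE IF NOT EXISTS ai_prompts (
    session INTEGER, line INTEGER, prompt TEXT, response TEXT)""")
conn.execute("INSERT INTO ai_prompts VALUES (?, ?, ?, ?)",
             (1, 1, "write a haiku about sqlite", "rows sleep in pages..."))
rows = conn.execute("SELECT prompt FROM ai_prompts WHERE session = 1").fetchall()
print(rows[0][0])
```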

When imported, ipyai also applies two small IPython compatibility fixes borrowed from ipykernel_helper for traceback and inspect.getfile edge cases.

Install

pip install ipyai

Load

%load_ext ipyai

If you change the package in a running shell:

%reload_ext ipyai

How To Auto-Load ipyai

ipyai is designed for terminal IPython. To auto-load it, add this line to an ipython_config.py file that your terminal IPython reads:

c.TerminalIPythonApp.extensions = ["ipyai.core"]

Good places for that file include:

  • env-local: {sys.prefix}/etc/ipython/ipython_config.py
  • user-local: ~/.ipython/profile_default/ipython_config.py
  • system-wide IPython config directories

In a virtualenv, the env-local path is usually:

  • .venv/etc/ipython/ipython_config.py

To see which config paths your current ipython is searching, run:

ipython --debug -c 'exit()' 2>&1 | grep Searching

Usage

Only the leading backtick is special. There is no closing delimiter.

Single line:

`write a haiku about sqlite

Multiline paste:

`summarize this module:
focus on state management
and persistence behavior

Backslash-Enter continuation in the terminal:

`draft a migration plan \
with risks and rollback steps
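
The backtick trigger can be pictured as an IPython input transformer: a function that takes the raw source lines and returns rewritten lines. The rewrite into a magic call below is an assumption for illustration; ipyai's actual transformer may work differently.

```python
# A "cleanup" input transformer maps a list of source lines to a list of lines.
def backtick_to_prompt(lines):
    if lines and lines[0].startswith("`"):
        prompt = "".join(lines)[1:]  # drop only the leading backtick; no closing delimiter
        return [f"get_ipython().run_line_magic('ipyai', {prompt!r})\n"]
    return lines  # ordinary Python input passes through untouched

# In a real extension this would be registered with:
#   get_ipython().input_transformers_cleanup.append(backtick_to_prompt)
print(backtick_to_prompt(["`write a haiku about sqlite\n"])[0])
```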

ipyai also provides a line and cell magic named %ipyai / %%ipyai.

%ipyai commands

%ipyai
%ipyai model claude-sonnet-4-6
%ipyai think m
%ipyai search h
%ipyai code_theme monokai
%ipyai save
%ipyai reset

Behavior:

  • %ipyai prints the active model, think level, search level, code theme, logging flag, and the current config file paths
  • %ipyai model ..., %ipyai think ..., %ipyai search ..., %ipyai code_theme ... change the current session only
  • %ipyai save writes the current session's code and AI prompts to startup.json
  • %ipyai reset deletes AI prompt history for the current IPython session and resets the code-context baseline
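
A minimal sketch of how the settings subcommands above might dispatch, assuming a per-session settings dict whose keys mirror the options listed (the dict and validation are illustrative, not ipyai's implementation):

```python
SESSION = {"model": "claude-sonnet-4-6", "think": "l", "search": "l",
           "code_theme": "monokai"}

def ipyai_magic(line):
    parts = line.split(None, 1)
    if not parts:                      # bare %ipyai: report current settings
        return dict(SESSION)
    key, value = parts[0], parts[1] if len(parts) > 1 else None
    if key in ("think", "search") and value not in ("l", "m", "h"):
        raise ValueError(f"{key} must be one of l, m, h")
    if key in SESSION:
        SESSION[key] = value           # changes the current session only
    return dict(SESSION)

print(ipyai_magic("think m")["think"])
```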

Tools

To expose a function from the active IPython namespace as a tool for the current conversation, reference it as &`name` in the prompt:

def weather(city): return f"Sunny in {city}"

`use &`weather` to answer the question about Brisbane

The tool name exposed to the model is the namespace name you referenced, so callable objects bound in user_ns work as expected. Async callables are supported too.
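
One plausible way to resolve such references is a regex scan of the prompt followed by a lookup in the user namespace; the pattern and lookup below are assumptions, not ipyai's code.

```python
import re

# Match &`name` references; \w+ is an assumed restriction on tool names.
TOOL_RE = re.compile(r"&`(\w+)`")

def weather(city): return f"Sunny in {city}"

user_ns = {"weather": weather}  # stands in for IPython's user namespace
prompt = "use &`weather` to answer the question about Brisbane"
tools = [user_ns[name] for name in TOOL_RE.findall(prompt)]
print(tools[0]("Brisbane"))
```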

Output Rendering

Responses are streamed directly to the terminal during generation.

  • in a TTY, ipyai uses Rich Live(Markdown(...)) so the visible response is rendered as markdown while it streams
  • the stored response remains the original full lisette output
  • tool call detail blocks are compacted in the visible output to a short single-line form such as 🔧 f(x=1) => 2
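
The tool-call compaction can be sketched as a single substitution over the streamed text. The `<details>` markup below is an assumed stand-in for lisette's verbose output format; only the compact `🔧 f(x=1) => 2` form comes from this README.

```python
import re

def compact_tool_calls(text):
    # Collapse a verbose call/result block to one line: 🔧 call => result
    return re.sub(r"<details>\s*<summary>(.*?)</summary>\s*(.*?)\s*</details>",
                  lambda m: f"\N{WRENCH} {m.group(1)} => {m.group(2)}",
                  text, flags=re.S)

print(compact_tool_calls("<details><summary>f(x=1)</summary>2</details>"))
```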

Startup Replay

On first load, ipyai creates ~/.config/ipyai/startup.json.

%ipyai save snapshots the current IPython session into that file:

  • normal code cells are saved as code events
  • AI prompts are saved as prompt/response events

When ipyai loads into a fresh terminal IPython session:

  • saved code events are replayed with run_cell(..., store_history=True)
  • saved prompt/response pairs are inserted into ai_prompts for the new session

This is intended for priming new sessions with imports, helper definitions, tools, and prior AI context without re-running the prompts themselves.
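
The snapshot-and-replay cycle might look like the sketch below. The event schema (a flat "events" list with code and prompt entries) is an assumption for illustration; the real startup.json layout is not documented here.

```python
import json, os, tempfile

# Assumed snapshot shape: code cells and prompt/response pairs as events.
events = [
    {"type": "code", "source": "import sqlite3"},
    {"type": "prompt", "prompt": "write a haiku about sqlite",
     "response": "rows sleep in pages..."},
]
path = os.path.join(tempfile.mkdtemp(), "startup.json")
with open(path, "w") as f:
    json.dump({"events": events}, f)

# On a fresh session, code events would be replayed via run_cell(...) and
# prompt/response pairs inserted into ai_prompts without re-running them.
replayed = [e["source"] for e in json.load(open(path))["events"]
            if e["type"] == "code"]
print(replayed)
```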

Configuration

At import time, ipyai defines module-level path variables for these XDG-backed files:

  • ~/.config/ipyai/config.json
  • ~/.config/ipyai/sysp.txt
  • ~/.config/ipyai/startup.json
  • ~/.config/ipyai/exact-log.jsonl

Those files are created on demand when ipyai first needs them.
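
Deriving those paths the standard XDG way is straightforward; this sketch shows the usual XDG_CONFIG_HOME fallback, which is an assumption about how ipyai resolves its base directory.

```python
import os

# XDG convention: honor XDG_CONFIG_HOME, fall back to ~/.config.
base = os.environ.get("XDG_CONFIG_HOME") or os.path.expanduser("~/.config")
cfg_dir = os.path.join(base, "ipyai")
for name in ("config.json", "sysp.txt", "startup.json", "exact-log.jsonl"):
    print(os.path.join(cfg_dir, name))
```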

config.json currently supports:

{
  "model": "claude-sonnet-4-6",
  "think": "l",
  "search": "l",
  "code_theme": "monokai",
  "log_exact": false
}

Notes:

  • model defaults from IPYAI_MODEL if that environment variable is set when the config file is first created
  • think and search must be one of l, m, or h
  • code_theme is passed to Rich for fenced and inline code styling
  • log_exact, when true, appends the exact full prompt sent to the model and the exact raw response returned by the model to ~/.config/ipyai/exact-log.jsonl
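
Seeding defaults with the IPYAI_MODEL fallback described above could look like this; the function name and defaults (taken from the example config.json) are illustrative, not ipyai's code.

```python
import os

def default_config():
    # model defaults from IPYAI_MODEL when the config file is first created
    return {
        "model": os.environ.get("IPYAI_MODEL", "claude-sonnet-4-6"),
        "think": "l", "search": "l",
        "code_theme": "monokai", "log_exact": False,
    }

os.environ["IPYAI_MODEL"] = "my-model"
print(default_config()["model"])
```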

sysp.txt is used as the system prompt passed to lisette.AsyncChat.

Development

See DEV.md for project layout, architecture, persistence details, and development workflow.
