ipyai
Multi-backend terminal IPython assistant.
ipyai is a terminal IPython extension with four AI backends:
- Codex API (codex-api, default) — hits the Codex responses endpoint directly using your ~/.codex/auth.json token
- Codex (codex) — local Codex app-server
- Claude CLI (claude-cli) — drives the claude -p CLI, so usage counts against your Claude subscription rather than API billing
- Claude API (claude-api) — direct Anthropic API access
It is aimed at terminal IPython, not notebook frontends.
Install
pip install -e ipyai
ipyai uses safepyrun for live Python state. Backend requirements:
- codex-api: local codex login so ~/.codex/auth.json holds a valid access token
- codex: local Codex app-server access
- claude-cli: local claude CLI install and Claude Code login (subscription auth)
- claude-api: Anthropic API access
How to Use Prompts
There are several ways to send an AI prompt from ipyai:
Dot prefix (.) — In normal IPython mode, start any line with . to send it as a prompt. Everything after the dot is sent to the selected backend. Continuation lines (without a dot) are included too, so you can write multi-line prompts:
.explain what this dataframe transform is doing
.draft a plan for this notebook:
focus on state management
and failure cases
Prompt mode — When prompt mode is on, every line you type is sent as an AI prompt by default. To run normal Python code instead, prefix the line with ;. Shell commands (!) and magics (%) still work as usual. There are three ways to enable prompt mode:
- opt-p (Alt-p) — toggle prompt mode on/off at any time from the terminal
- -p flag — start ipyai in prompt mode: ipyai -p
- prompt_mode config — set "prompt_mode": true in config.json to always start in prompt mode
You can also toggle prompt mode during a session with %ipyai prompt.
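The routing rules above can be sketched as a small dispatch function. This is an illustrative model of the documented behavior, not ipyai's actual implementation:

```python
def route(line: str, prompt_mode: bool) -> str:
    """Decide how an input line is handled, per the rules above."""
    stripped = line.lstrip()
    if stripped.startswith(("!", "%")):
        return "shell/magic"   # shell commands and magics always work as usual
    if prompt_mode:
        if stripped.startswith(";"):
            return "python"    # ; escapes back to normal Python in prompt mode
        return "prompt"        # prompt mode sends everything else to the AI
    if stripped.startswith("."):
        return "prompt"        # dot prefix sends a prompt in normal mode
    return "python"            # otherwise: ordinary IPython execution
```

For example, `route(";x = 1", True)` runs the line as Python even while prompt mode is on.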
CLI
ipyai
Flags:
ipyai -r # resume last session for the selected backend
ipyai -r 43 # resume session 43
ipyai -l session.ipynb # load a saved notebook session at startup
ipyai -b claude-cli # select backend: codex-api | codex | claude-cli | claude-api
ipyai -p # start in prompt mode
On exit, ipyai prints the session ID so you can resume later.
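The flag behavior above can be modeled with argparse. This parser only mirrors the documented flags for illustration; ipyai's real CLI code is not shown in this README:

```python
import argparse

parser = argparse.ArgumentParser(prog="ipyai")
# -r with no value resumes the last session; with a value, that session
parser.add_argument("-r", nargs="?", const="last", metavar="SESSION",
                    help="resume last session, or session SESSION")
parser.add_argument("-l", metavar="NOTEBOOK",
                    help="load a saved notebook session at startup")
parser.add_argument("-b", default="codex-api",
                    choices=["codex-api", "codex", "claude-cli", "claude-api"],
                    help="select backend")
parser.add_argument("-p", action="store_true", help="start in prompt mode")

args = parser.parse_args(["-r", "43", "-b", "claude-cli", "-p"])
```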
Load As Extension
%load_ext ipyai
Usage
ipyai is a normal IPython session — you can run Python code exactly as you would in plain IPython. On top of that, you can send prompts to the selected AI backend as described above. %ipyai / %%ipyai magics are also available.
Useful commands:
%ipyai
%ipyai model sonnet            # set the chat model for the current backend
%ipyai completion_model haiku  # set the inline-completion model
%ipyai think m                 # set the think level
%ipyai code_theme monokai      # set the code highlighting theme
%ipyai log_exact true          # enable exact logging (exact-log.jsonl)
%ipyai prompt                  # toggle prompt mode
%ipyai save mysession          # save a notebook snapshot
%ipyai load mysession          # restore a saved notebook session
%ipyai sessions
%ipyai reset
Context Model
For each AI prompt, ipyai sends:
- recent IPython code as <code>
- string-literal note cells as <note>
- recent outputs as <output>
- the current request as <user-request>
- referenced live variables as <variable>
- referenced shell command output as <shell>
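A rough sketch of how these tagged blocks could be assembled into one prompt context. The tag names come from the list above; the helper name and field contents are illustrative:

```python
def build_context(code, outputs, request, notes=(), variables=()):
    """Assemble XML-tagged context blocks, ending with the user request."""
    parts = []
    parts += [f"<code>{c}</code>" for c in code]
    parts += [f"<note>{n}</note>" for n in notes]
    parts += [f"<output>{o}</output>" for o in outputs]
    parts += [f"<variable>{v}</variable>" for v in variables]
    parts.append(f"<user-request>{request}</user-request>")
    return "\n".join(parts)
```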
Prompt history is stored in SQLite; for compatibility, the table is currently named claude_prompts. Session metadata is stored in IPython's sessions.remark JSON, including cwd, backend, and provider_session_id.
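Since the history table name is documented as claude_prompts, it can be inspected with the stdlib sqlite3 module. The column layout below is a guess for illustration; only the table name comes from the README:

```python
import sqlite3

# In-memory stand-in for ipyai's on-disk prompt-history database
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE claude_prompts (id INTEGER PRIMARY KEY, prompt TEXT)")
con.execute("INSERT INTO claude_prompts (prompt) VALUES (?)", ("explain df",))
rows = con.execute("SELECT prompt FROM claude_prompts").fetchall()
```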
Tools
ipyai exposes the same custom tools across all backends:
- pyrun: run Python in the live IPython namespace
- bash: run an allowed shell command via safecmd
- start_bgterm: start a persistent shell session
- write_stdin: send input to a persistent shell session and read output
- close_bgterm: close a persistent shell session
- lnhashview_file: view hash-addressed file lines for verified edits
- exhash_file: apply verified hash-addressed edits to a file
When using the Claude CLI backend, ipyai also enables these built-in Claude Code tools: Bash, Edit, Read, Skill, WebFetch, WebSearch, Write.
Custom tools are exposed to claude -p through an in-process unix-socket MCP server plus a small stdio bridge subprocess (ipyai-mcp-bridge), so the subscription-driven CLI can still call live-kernel tools like pyrun.
The ipyai CLI loads safepyrun before ipyai, so pyrun is available by default in normal terminal use.
bash, start_bgterm, write_stdin, close_bgterm, lnhashview_file, and exhash_file are seeded into the user namespace by ipyai.
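Seeding names into the user namespace is typically done with IPython's InteractiveShell.push API from a load_ipython_extension hook. The sketch below shows that pattern with a stand-in shell object so it runs outside IPython; the placeholder bash function is not ipyai's real tool:

```python
class FakeShell:
    """Stand-in for IPython's InteractiveShell, enough for this sketch."""

    def __init__(self):
        self.user_ns = {}

    def push(self, variables):
        # InteractiveShell.push merges a dict into the user namespace
        self.user_ns.update(variables)


def load_ipython_extension(shell):
    def bash(cmd):
        # Placeholder: the real tool runs allowed commands via safecmd
        return f"would run: {cmd}"

    shell.push({"bash": bash})


shell = FakeShell()
load_ipython_extension(shell)
```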
Skills
When using the Claude CLI backend, ipyai enables the built-in Skill tool and passes --setting-sources user,project so claude -p loads your normal user- and project-level skills.
Notebook Save/Load
%ipyai save <filename> writes a notebook snapshot. It stores:
- code cells
- note cells
- AI responses as markdown cells
- prompt metadata, including both prompt and full_prompt
%ipyai load <filename> restores that notebook into a fresh session.
ipyai -l <filename> does the same during startup.
Backend restore is backend-specific:
- claude-cli: synthesizes a fresh Claude transcript JSONL each turn and resumes from it with claude -p --resume
- claude-api: reuses the saved local prompt history directly on each turn
- codex-api: reuses the saved local prompt history directly on each turn (same flat-history flow as claude-api)
- codex: starts a fresh thread and sends the loaded notebook as XML context once
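A notebook snapshot is ordinary .ipynb JSON, so the stdlib is enough to read one back. The cell fields below are standard nbformat keys; how ipyai stores its prompt metadata is not specified here:

```python
import json
import os
import tempfile

# Minimal hand-built notebook with one code cell and one markdown cell
nb = {
    "nbformat": 4, "nbformat_minor": 5, "metadata": {},
    "cells": [
        {"cell_type": "code", "source": "x = 1", "metadata": {},
         "outputs": [], "execution_count": None},
        {"cell_type": "markdown", "source": "AI response here", "metadata": {}},
    ],
}
path = os.path.join(tempfile.mkdtemp(), "mysession.ipynb")
with open(path, "w") as f:
    json.dump(nb, f)

# Reload and pull out just the code cells, as a loader would
with open(path) as f:
    loaded = json.load(f)
code_cells = [c["source"] for c in loaded["cells"] if c["cell_type"] == "code"]
```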
Keyboard Shortcuts
- Alt-.: AI inline completion
- Alt-p: toggle prompt mode
- Alt-Up/Down: history navigation
- Alt-Shift-W: paste all Python code blocks from the last response
- Alt-Shift-1 through Alt-Shift-9: paste the Nth Python code block
- Alt-Shift-Up/Down: cycle through extracted Python blocks
Config
Config lives under XDG_CONFIG_HOME/ipyai/:
- config.json
- sysp.txt
- exact-log.jsonl
config.json supports:
{
"backend": "codex-api",
"models": {
"claude-cli": {"model": "sonnet", "completion_model": "haiku", "think": "m"},
"claude-api": {"model": "claude-sonnet-4-6", "completion_model": "claude-haiku-4-5-20251001", "think": "m"},
"codex": {"model": "gpt-5.4", "completion_model": "gpt-5.4-mini", "think": "m"},
"codex-api": {"model": "gpt-5.4", "completion_model": "gpt-5.4-mini", "think": "m"}
},
"code_theme": "monokai",
"log_exact": false,
"prompt_mode": false
}
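Resolving the config location can be sketched as follows. The XDG_CONFIG_HOME fallback to ~/.config follows the XDG Base Directory spec; the DEFAULTS dict here is illustrative, not ipyai's full default set:

```python
import json
import os
from pathlib import Path

# Illustrative defaults; the real extension has more keys (see above)
DEFAULTS = {"backend": "codex-api", "prompt_mode": False}


def config_path():
    """Return XDG_CONFIG_HOME/ipyai/config.json, defaulting to ~/.config."""
    base = os.environ.get("XDG_CONFIG_HOME") or str(Path.home() / ".config")
    return Path(base) / "ipyai" / "config.json"


def load_config():
    """Merge the on-disk config (if any) over the defaults."""
    cfg = dict(DEFAULTS)
    p = config_path()
    if p.exists():
        cfg.update(json.loads(p.read_text()))
    return cfg
```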
Development
See DEV.md.