
Terminal coding agent with planning, memory, web search, and workspace-scoped tools


Zain

A terminal coding agent with workspace memory, enforced planning, shell access, file search, and live docs lookup.

Built with LangChain, LangGraph, Azure OpenAI, Tavily, and uv.


What Zain Is

Zain is an interactive coding CLI that works inside a user-provided workspace. It can inspect files, search the codebase, run shell commands, plan multi-step work, and resume named conversations from SQLite-backed memory.

It is designed for coding tasks, not generic chat. The agent treats the chosen workspace as the project root and keeps its work scoped there.

Highlights

  • Workspace-rooted coding agent built with LangChain create_agent()
  • Azure OpenAI model integration through AzureChatOpenAI
  • Tavily-backed web_search tool for documentation and current references
  • Structured filesystem tools through FilesystemMiddleware
  • Shell execution on the host machine through LangChain ShellToolMiddleware
  • File discovery through FilesystemFileSearchMiddleware
  • Explicit planning through TodoListMiddleware
  • .zain/PLAN.md plan mirroring and completion verification before final replies
  • Long-term conversation memory in .zain/memory.db
  • Automatic conversation summarization when context usage reaches 90%
  • Interactive CLI with named conversation resume support

Why It Feels Different

Zain is opinionated about agent discipline:

  • It plans before doing non-trivial coding work.
  • It mirrors active todos into a real file at .zain/PLAN.md.
  • It pushes the model to work through the plan one item at a time.
  • It checks the plan again before the final answer.
  • Once the work is complete, it deletes the plan file before replying.

That gives you a visible execution trail instead of an opaque tool loop.

Core Workflow

User request
  -> create/update todos
  -> write .zain/PLAN.md
  -> inspect/search/edit/run commands inside workspace
  -> keep todos in sync
  -> verify all plan items are completed
  -> delete .zain/PLAN.md
  -> answer user

Architecture

flowchart TD
    U[User in terminal] --> CLI[Typer CLI]
    CLI --> AGENT[LangChain create_agent]
    AGENT --> MODEL[Azure OpenAI]
    AGENT --> FS[Filesystem search middleware]
    AGENT --> SH[Shell tool middleware]
    AGENT --> TODO[Todo middleware]
    AGENT --> PLAN[Plan file middleware]
    AGENT --> WEB[Tavily web_search tool]
    AGENT --> SUM[Summarization middleware]
    AGENT --> MEM[SQLite memory via LangGraph checkpointer]
    MEM --> ZAINDIR[".zain/memory.db"]
    PLAN --> PLANFILE[".zain/PLAN.md"]

Feature Breakdown

1. Workspace Scoping

When you start Zain, you pass a working directory:

uv run zain /path/to/workspace

That directory becomes the workspace root. Zain treats it as the project root for:

  • file questions
  • code summaries
  • shell execution
  • repository inspection
  • project-level answers
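The scoping rule above can be sketched with a small path check. This is a minimal stdlib sketch of the idea, not Zain's actual implementation; the function name is illustrative:

```python
from pathlib import Path

def resolve_in_workspace(root: Path, candidate: str) -> Path:
    """Resolve a path and refuse anything that escapes the workspace root."""
    root = root.resolve()
    target = (root / candidate).resolve()
    if target != root and root not in target.parents:
        raise ValueError(f"path escapes workspace: {candidate}")
    return target
```

Resolving both paths before comparing catches `..` traversal and symlink-style escapes that a plain string prefix check would miss.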

2. Persistent Conversations

Zain stores named conversations in SQLite. You must explicitly start or resume a conversation before chatting:

/start api-refactor

If the name exists, Zain resumes it. Otherwise, it creates a new one.
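The start-or-resume behavior can be sketched against a plain SQLite table. This is an illustrative stdlib sketch, not Zain's actual schema (the real state lives in a LangGraph checkpointer):

```python
import sqlite3

def start_or_resume(db_path: str, name: str) -> str:
    """Return 'resumed' if the conversation name exists, else create it."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS conversations (name TEXT PRIMARY KEY)"
    )
    exists = conn.execute(
        "SELECT 1 FROM conversations WHERE name = ?", (name,)
    ).fetchone()
    if not exists:
        conn.execute("INSERT INTO conversations (name) VALUES (?)", (name,))
        conn.commit()
    conn.close()
    return "resumed" if exists else "created"
```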

3. Plan File Enforcement

For coding work, the agent uses todos and mirrors them into:

.zain/PLAN.md

That file acts as a visible checklist for the current task. The agent is instructed to:

  1. create todos first
  2. read PLAN.md
  3. execute work one step at a time
  4. update todos as progress changes
  5. re-read PLAN.md before the final answer
  6. delete PLAN.md
  7. reply only after cleanup
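The mirror-then-verify-then-delete lifecycle can be sketched in a few lines. This is a minimal stdlib sketch under the assumption that PLAN.md uses Markdown checkboxes; the helper names are illustrative, not Zain's actual API:

```python
from pathlib import Path

def write_plan(workspace: Path, todos: list[tuple[str, bool]]) -> Path:
    """Mirror todos into .zain/PLAN.md as a Markdown checklist."""
    plan = workspace / ".zain" / "PLAN.md"
    plan.parent.mkdir(parents=True, exist_ok=True)
    lines = ["# Plan", ""]
    lines += [f"- [{'x' if done else ' '}] {item}" for item, done in todos]
    plan.write_text("\n".join(lines) + "\n")
    return plan

def finish_if_complete(plan: Path) -> bool:
    """Delete the plan file only when no unchecked items remain."""
    if "- [ ]" in plan.read_text():
        return False  # guard against a premature final answer
    plan.unlink()
    return True
```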

4. Web Search

Zain exposes a web_search tool backed by Tavily. The agent can use it to find current documentation, APIs, library references, and external technical context.
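A request to Tavily's search endpoint roughly takes this shape. The URL and field names below are assumptions about Tavily's public REST API, not taken from Zain's source; only the payload is built here, no network call is made:

```python
import os

TAVILY_URL = "https://api.tavily.com/search"  # assumed endpoint

def build_search_payload(query: str, max_results: int = 5) -> dict:
    """Build the JSON body a web_search call might POST to Tavily."""
    return {
        "api_key": os.environ.get("TAVILY_API_KEY", ""),
        "query": query,
        "max_results": max_results,
    }
```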

5. Long-Running Shell Work

Shell commands run on the host machine with the workspace as the working directory. The default shell command timeout is 60 minutes.
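Conceptually, that execution model is a subprocess with the workspace as its working directory and a generous timeout. A minimal stdlib sketch (not the actual ShellToolMiddleware, which keeps a persistent shell):

```python
import subprocess
from pathlib import Path

def run_in_workspace(workspace: Path, command: str, timeout: int = 3600) -> str:
    """Run a shell command with the workspace as cwd; 3600 s mirrors the default timeout."""
    result = subprocess.run(
        command,
        shell=True,
        cwd=workspace,
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.stdout
```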

Tech Stack

Layer             Choice
CLI               Typer + Rich
Agent runtime     LangChain + LangGraph
Filesystem tools  deepagents FilesystemMiddleware
LLM               Azure OpenAI
Web lookup        Tavily
Memory            SQLite via langgraph-checkpoint-sqlite
Package manager   uv

Project Layout

.
├── pyproject.toml
├── README.md
├── src/
│   └── nirvana_coding_agent/
│       ├── __init__.py
│       ├── __main__.py
│       ├── agent.py
│       ├── cli.py
│       ├── config.py
│       ├── memory.py
│       ├── paths.py
│       └── planning.py
└── .env

Quick Start

Requirements

  • Python 3.11+
  • uv
  • Azure OpenAI credentials
  • Tavily API key

Install

uv sync

Minimal .env

AZURE_OPENAI_API_KEY="your-azure-openai-key"
AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/openai/responses?api-version=2025-04-01-preview"
AZURE_OPENAI_DEPLOYMENT="gpt-5.2"
TAVILY_API_KEY="your-tavily-key"
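Zain most likely loads these through a dotenv library; as a sketch of what that amounts to, here is a minimal stdlib parser for the file format above (illustrative only, not `config.py`):

```python
from pathlib import Path

def load_env(path: str = ".env") -> dict[str, str]:
    """Parse KEY="value" lines from a .env file, skipping comments and blanks."""
    env: dict[str, str] = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env
```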

Run

uv run zain /path/to/workspace

Start a Conversation

/start first-session

Example Prompt

Scaffold a new FastAPI project with basic CRUD endpoints, use SQLite for the database, and manage dependencies with uv.

CLI Commands

Command        Description
/start <name>  Start or resume a named conversation
/conversation  List saved conversation names
/todos         Show the current todo list for the active conversation
/help          Show available commands
/exit          Exit the CLI

State Inside the Workspace

Zain writes runtime state into a hidden workspace folder:

.zain/
├── memory.db
├── memory.db-shm
├── memory.db-wal
└── PLAN.md

What These Files Do

  • memory.db: stores conversation history and checkpointed state
  • memory.db-shm / memory.db-wal: SQLite sidecar files
  • PLAN.md: temporary task plan file for the current coding task

If an older workspace still has .nirvana/memory.db, Zain copies that database into .zain/ on first startup for compatibility.

Configuration

Required Environment Variables

Variable                 Purpose
AZURE_OPENAI_API_KEY     Azure OpenAI API key
AZURE_OPENAI_ENDPOINT    Azure OpenAI endpoint or full Responses URL
AZURE_OPENAI_DEPLOYMENT  Main deployment name
TAVILY_API_KEY           Tavily API key for web search

Optional Environment Variables

Variable                                   Default          Purpose
AZURE_OPENAI_MODEL                         deployment name  Override model label
AZURE_OPENAI_SUMMARY_DEPLOYMENT            main deployment  Separate deployment for summarization
AZURE_OPENAI_SUMMARY_MODEL                 model label      Separate model label for summarization
OPENAI_OUTPUT_VERSION                      responses/v1     OpenAI Responses API output version
OPENAI_CONTEXT_WINDOW_TOKENS               400000           Assumed input context window
OPENAI_REQUEST_TIMEOUT_SECONDS             180              Model request timeout
NIRVANA_SUMMARY_TRIGGER_FRACTION           0.9              Summarize once context usage crosses this fraction
NIRVANA_SUMMARY_KEEP_MESSAGES              12               Messages preserved around summarization
NIRVANA_SHELL_TIMEOUT_SECONDS              3600             Shell command timeout
NIRVANA_SHELL_STARTUP_TIMEOUT_SECONDS      30               Shell startup timeout
NIRVANA_SHELL_TERMINATION_TIMEOUT_SECONDS  10               Shell termination timeout
NIRVANA_SHELL_MAX_OUTPUT_LINES             300              Shell output line cap
NIRVANA_SHELL_MAX_OUTPUT_BYTES             unset            Optional shell output byte cap
NIRVANA_SHELL_PROGRAM                      /bin/bash        Shell binary
TAVILY_MAX_RESULTS                         5                Tavily result count
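With the defaults above, the summarization trigger works out to a concrete token count:

```python
def summary_trigger_tokens(context_window: int = 400_000, fraction: float = 0.9) -> int:
    """Token count at which summarization kicks in, given the defaults above."""
    return int(context_window * fraction)

# Default 400,000-token window * 0.9 trigger fraction:
# summarization starts once input usage reaches 360,000 tokens.
```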

Note: advanced env vars still use the legacy NIRVANA_ prefix for backward compatibility.

How Memory Works

Conversation state is persisted through a SQLite-backed LangGraph checkpointer. Each conversation uses a thread_id equal to the conversation name.

That gives Zain:

  • resumable conversations
  • persisted todos
  • durable agent state across CLI restarts

The implementation also filters transient LangGraph scheduler internals before writing checkpoints so tool-heavy runs can be serialized safely.
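Zain delegates this to langgraph-checkpoint-sqlite; the sketch below only illustrates the two ideas in stdlib terms: state keyed by thread_id, and transient keys filtered out before serialization. The key name and schema are illustrative, not the real ones:

```python
import json
import sqlite3

TRANSIENT_KEYS = {"__scheduler_scratch"}  # illustrative, not real LangGraph keys

def save_checkpoint(db_path: str, thread_id: str, state: dict) -> None:
    """Persist the latest state for a conversation, filtering transient keys first."""
    clean = {k: v for k, v in state.items() if k not in TRANSIENT_KEYS}
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS checkpoints (thread_id TEXT PRIMARY KEY, state TEXT)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO checkpoints VALUES (?, ?)",
        (thread_id, json.dumps(clean)),
    )
    conn.commit()
    conn.close()

def load_checkpoint(db_path: str, thread_id: str):
    """Return the saved state for a thread_id, or None if it has none."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS checkpoints (thread_id TEXT PRIMARY KEY, state TEXT)"
    )
    row = conn.execute(
        "SELECT state FROM checkpoints WHERE thread_id = ?", (thread_id,)
    ).fetchone()
    conn.close()
    return json.loads(row[0]) if row else None
```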

How Planning Works

Zain combines LangChain's TodoListMiddleware with a custom plan-file middleware.

The result is:

  • visible planning in the terminal
  • persisted todo state
  • a real plan file inside the workspace
  • a guard against premature final answers

It also combines two filesystem layers:

  • FilesystemMiddleware for direct file operations like ls, read_file, write_file, edit_file, glob, and grep
  • FilesystemFileSearchMiddleware for repository search-oriented discovery

This is especially useful for code generation, refactors, setup work, and multi-file changes.

UX Details

  • Zain uses Rich panels and tables for output.
  • If Markdown rendering fails, it falls back to plain text instead of crashing.
  • Typer pretty-exception rendering is disabled to keep the CLI stable in minimal environments.

Development

Install dependencies

uv sync

Run the CLI locally

uv run zain /path/to/workspace

Roadmap Ideas

  • safer write/edit primitives beyond raw shell usage
  • richer diff visualization in the CLI
  • test runner integration and patch summaries
  • optional approval policies for destructive commands
  • model/provider profiles beyond Azure OpenAI

Notes

  • The current shell execution policy is host-based, not Docker-isolated.
  • The workspace root is the only project scope the agent should assume.
  • Web search is available, but used only when the model decides it needs external information.

License

No license file is included yet. Add one before publishing if you want the repo to be clearly reusable.


