Upstream intelligence for AI coding agents – keeps CLAUDE.md in sync across your entire repo graph
🧠 upstreamiq
Upstream intelligence for AI coding agents across multiple repos.
Claude Code or Cursor starts every session blind to your other repos. upstreamiq fixes that – automatically, surgically, in 60 seconds.
pip install upstreamiq
🤔 The problem every multi-repo developer hits
You're working across 3 repos:
shared-types/    api-service/      frontend/
User             GET /users        <UserCard>
AuthToken        POST /users       useAuth()
ApiResponse      DELETE /users     fetchUser()
You open Claude Code in frontend. You ask it to update the user profile form.
Claude writes code using user.email – which your api-service changed to user.emails[] three weeks ago.
It compiles. It crashes in production.
You spend 10 minutes re-explaining your API shape. Again. Every. Single. Session.
✅ The fix – upstreamiq
upstreamiq automatically extracts the public interface of your upstream repos – types, endpoints, OpenAPI contracts – and writes a surgical CLAUDE.upstream.md into each downstream repo.
┌────────────────────────────────────────────────────┐
│  Your repo graph                                   │
│                                                    │
│  shared-types ───► api-service ───► frontend      │
│        │                └─────────► mobile        │
│        └──────────────────────────► frontend      │
│                                                    │
│  upstreamiq watches all of these.                  │
│  When api-service changes → frontend gets updated. │
│  When shared-types changes → everyone gets updated.│
└────────────────────────────────────────────────────┘
Each downstream repo gets a CLAUDE.upstream.md – automatically written, always fresh, always under 200 lines:
## api-service [calls_rest]
> Last synced: a3f9c2b · 2 hours ago
> ⚠️ BREAKING CHANGE: User.email → User.emails[] (commit a3f9c2b: "refactor user model")
### Exported types
class User(BaseModel):
    id: str
    emails: list[str]  # changed from: email: str
    name: str
    created_at: str

class CreateUserRequest(BaseModel):
    email: str
    name: str
### API endpoints
GET    /api/users/{user_id} → User
POST   /api/users           → User
DELETE /api/users/{user_id} → 204
Claude Code reads this at the start of every session. It already knows. You write correct code the first time.
🚀 Quickstart – 5 minutes to full setup
Step 1 – Install
pip install upstreamiq
Step 2 – Register your repos
upstreamiq add api-service ~/projects/api-service
upstreamiq add shared-types ~/projects/shared-types
upstreamiq add frontend ~/projects/frontend
upstreamiq add mobile ~/projects/mobile
Output:
✓ Registered: api-service (python) at ~/projects/api-service
✓ Registered: shared-types (typescript) at ~/projects/shared-types
✓ Registered: frontend (typescript) at ~/projects/frontend
✓ Registered: mobile (typescript) at ~/projects/mobile
Step 3 – Define relationships
Tell upstreamiq which repo consumes which:
upstreamiq link frontend --consumes api-service
upstreamiq link frontend --consumes shared-types --type imports_types
upstreamiq link mobile --consumes api-service
upstreamiq link mobile --consumes shared-types --type imports_types
upstreamiq link api-service --consumes shared-types --type imports_types
Output:
✓ Linked: frontend → api-service (calls_rest)
✓ Linked: frontend → shared-types (imports_types)
✓ Linked: mobile → api-service (calls_rest)
✓ Linked: mobile → shared-types (imports_types)
✓ Linked: api-service → shared-types (imports_types)
Step 4 – Generate upstream context
upstreamiq sync
Output:
Syncing downstream repos...
✓ frontend/CLAUDE.upstream.md – 2 upstreams, 84 lines (0.1s)
✓ mobile/CLAUDE.upstream.md – 2 upstreams, 79 lines (0.1s)
2 repos synced. Your AI agents have fresh upstream context.
Step 5 – Wire it into Claude Code
Add one line to frontend/CLAUDE.md:
@CLAUDE.upstream.md
That's it. Every Claude Code session in frontend now starts with full knowledge of api-service and shared-types.
Step 6 – Keep it fresh forever
upstreamiq watch
┌──────────────────────────────────────────────┐
│ upstreamiq – Watch Mode                      │
│ Watching 4 repos | Interval: 30s             │
│ Ctrl+C to stop                               │
├──────────────────────────────────────────────┤
│ api-service    ✓ synced (14:31:58)           │
│ shared-types   ✓ synced (14:31:58)           │
└──────────────────────────────────────────────┘
⚡ api-service changed (commit a3f9c2b: "refactor User model")
Detected: BREAKING change in type User
Before: email: string
After: emails: string[]
✓ frontend/CLAUDE.upstream.md updated (breaking change highlighted)
✓ mobile/CLAUDE.upstream.md updated (breaking change highlighted)
🗺️ Cross-repo task planning
Working on a feature that spans multiple repos? upstreamiq generates the right order automatically:
upstreamiq task "add phone number to user profiles"
Output file ~/repolink-tasks/add-phone-number-to-user-profiles.md:
# Cross-repo task: add phone number to user profiles
## Change sequence
| Step | Repo         | Why this order                         |
|------|--------------|----------------------------------------|
| 1    | shared-types | Others import from here – change first |
| 2    | api-service  | Depends on shared-types                |
| 3    | frontend     | Depends on api-service                 |
| 4    | mobile       | Depends on api-service                 |
---
## Step 1: shared-types
Open a Claude Code session in: ~/projects/shared-types
Suggested instruction for Claude Code:
"add phone number to user profiles โ start with the shared type definitions.
Add any new types, update existing ones for backward compatibility."
After completing:
→ Run: upstreamiq sync
→ This updates CLAUDE.upstream.md in api-service, frontend, mobile.
→ Then move to Step 2.
Work through one repo at a time. Each step's Claude session has perfect context from the previous step.
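The change sequence above is what you'd get from a topological sort of the link graph – upstreams first, downstreams after. A minimal sketch of that ordering using the standard library (an illustration with the example repos, not upstreamiq's actual implementation):

```python
from graphlib import TopologicalSorter

# Each repo maps to the set of upstreams it consumes, so the
# sorter yields upstreams before any repo that depends on them.
consumes = {
    "api-service": {"shared-types"},
    "frontend": {"api-service", "shared-types"},
    "mobile": {"api-service", "shared-types"},
}

order = list(TopologicalSorter(consumes).static_order())
print(order)  # shared-types first, frontend/mobile last
```

`graphlib.TopologicalSorter` also raises `CycleError` on circular links, which is exactly the failure mode you want surfaced before planning a cross-repo change.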
🔍 What gets extracted
upstreamiq is smart about what it reads – it never dumps the whole codebase.
| Language     | What it extracts                                      | How              |
|--------------|-------------------------------------------------------|------------------|
| TypeScript   | export interface, export type, Express/Next.js routes | Regex-based      |
| Python       | Pydantic BaseModel, @dataclass, FastAPI/Flask routes  | AST-based        |
| OpenAPI      | All paths, all components/schemas                     | YAML/JSON parser |
| Any language | URL patterns, README API docs, .proto, .graphql       | Fallback scan    |
Priority: if an OpenAPI spec exists → use it first (highest quality). Then layer in language extractors for types.
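As a rough illustration of what a regex-based TypeScript pass looks like (a sketch only – the pattern below is an assumption, not the shipped extractor):

```python
import re

# Sketch: pull exported type/interface names out of TypeScript source.
# Assumes each declaration starts at the beginning of a line.
EXPORT_RE = re.compile(r"^export\s+(?:interface|type)\s+(\w+)", re.MULTILINE)

ts_src = """
export interface User { id: string; emails: string[] }
export type ApiResponse<T> = { data: T }
const internal = 1
"""

print(EXPORT_RE.findall(ts_src))  # ['User', 'ApiResponse']
```

Non-exported declarations like `internal` never match, which is the point: only the public surface reaches CLAUDE.upstream.md.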
🛠️ All commands
# Setup
upstreamiq add NAME PATH # Register a repo
upstreamiq link A --consumes B # Define A depends on B
upstreamiq unlink A --from B # Remove a dependency
upstreamiq remove NAME # Unregister a repo
upstreamiq init [PATH] # Auto-scan directory for git repos
# Core workflow
upstreamiq extract [REPO] # Extract API surface (run once to seed)
upstreamiq sync [REPO] # Write/update CLAUDE.upstream.md
upstreamiq watch # Watch + auto-sync on every commit
# Inspection
upstreamiq list # Show repos + dependency graph
upstreamiq show REPO # Show full extracted surface
upstreamiq changes [UPSTREAM] # Show recent breaking changes
upstreamiq status # Health check of your setup
# Planning
upstreamiq task "description" # Generate cross-repo task plan
📁 How upstreamiq fits into your project
your-projects/
├── shared-types/
│   ├── src/types/user.ts
│   └── CLAUDE.md            ← no upstream context needed (it's the source)
│
├── api-service/
│   ├── main.py
│   ├── CLAUDE.md            ← add: @CLAUDE.upstream.md
│   └── CLAUDE.upstream.md   ← AUTO-GENERATED by upstreamiq
│                              contains: shared-types interface
│
├── frontend/
│   ├── src/
│   ├── CLAUDE.md            ← add: @CLAUDE.upstream.md
│   └── CLAUDE.upstream.md   ← AUTO-GENERATED by upstreamiq
│                              contains: api-service + shared-types interface
│
└── mobile/
    ├── src/
    ├── CLAUDE.md            ← add: @CLAUDE.upstream.md
    └── CLAUDE.upstream.md   ← AUTO-GENERATED by upstreamiq
                               contains: api-service + shared-types interface
⚙️ Optional configuration
Add a .upstreamiq.toml to any repo to customise extraction:
[repo]
name = "api-service"
description = "FastAPI REST API"
language = "python"
api_spec = "openapi.yaml" # Use this OpenAPI spec (highest quality)
[surface]
include = ["src/", "app/", "models/"]
exclude = ["migrations/", "tests/", "scripts/"]
[conventions]
# These appear in CLAUDE.upstream.md of every downstream repo
consumer_notes = [
"All dates are ISO 8601 strings, never Unix timestamps",
"Pagination uses cursor-based pagination, not page numbers",
"Errors follow: { code: str, message: str, details?: dict }",
"All endpoints require Bearer token unless marked public",
]
🔄 How it works under the hood
┌──────────────────────────────────────────────────────────┐
│ upstreamiq watch loop                                    │
│                                                          │
│ every 30s:                                               │
│   for each registered repo:                              │
│     check git HEAD sha                                   │
│     if changed:                                          │
│       re-extract public surface (types + endpoints)      │
│       diff against cached surface                        │
│       if breaking change detected:                       │
│         record ChangeEvent (BREAKING)                    │
│         add ⚠️ notice to downstream CLAUDE.upstream.md   │
│       else if additive change:                           │
│         update CLAUDE.upstream.md silently               │
│       notify terminal                                    │
└──────────────────────────────────────────────────────────┘
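The "check git HEAD sha" step needs nothing more than a plain git subprocess call. A hedged sketch of how that check could look (upstreamiq's internals may differ):

```python
import subprocess

def head_sha(repo_path: str) -> str:
    """Return the current commit sha for a repo. A watch loop would
    compare this against the last sha it saw and re-extract only on change."""
    result = subprocess.run(
        ["git", "rev-parse", "HEAD"],
        cwd=repo_path, capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```

Polling the sha instead of watching the filesystem keeps the loop simple and ignores uncommitted edits, so downstream context only updates on real commits.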
Breaking change detection looks for:
- 🔴 BREAKING – field removed, type changed, endpoint removed
- 🟡 ADDITIVE – new field, new endpoint (safe, no notice)
- ⚪ INTERNAL – implementation change, no interface impact (ignored)
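These three severities map naturally onto a dictionary diff of the extracted surface. A hypothetical sketch of such a classifier (field-to-type maps only; endpoint diffs would follow the same pattern):

```python
def classify(old: dict[str, str], new: dict[str, str]) -> str:
    """Classify a change by diffing two field -> type mappings."""
    removed = old.keys() - new.keys()
    retyped = {f for f in old.keys() & new.keys() if old[f] != new[f]}
    if removed or retyped:
        return "BREAKING"   # field removed or type changed
    if new.keys() - old.keys():
        return "ADDITIVE"   # new fields only - safe
    return "INTERNAL"       # no interface impact

old_user = {"id": "str", "email": "str"}
new_user = {"id": "str", "emails": "list[str]"}
print(classify(old_user, new_user))  # BREAKING
```

The User example from earlier classifies as BREAKING because `email` disappears, even though `emails` is new – exactly the case the ⚠️ notice exists for.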
📦 Requirements
- Python 3.11+
- Git (repos must be git repos for watch mode)
- Works with any AI coding agent that reads CLAUDE.md (Claude Code, Cursor, Copilot, etc.)
License
MIT · Built by Raja Wajahat