
Universal switchboard for the Context-Pipe Protocol (CPP)

Project description

⛓️ Context-Pipe

The Universal Standard for Context Engineering.


context-pipe is a high-performance orchestration layer designed to bring the Unix Philosophy to the AI context window. It allows you to connect AI tools (Spokes) into a series of Streams, ensuring that data is refined, distilled, and noise-free before it ever reaches the LLM.


🚀 The Vision

In the "Studio of Two" philosophy, we build Systems, not Patches. context-pipe is the system that manages the flow of context, allowing you to chain specialized tools (Refineries) like semantic-sift into your agentic workflows with zero token overhead and millisecond latency.


🛠️ Core Components

1. The Context-Pipe Protocol (CPP)

A language-agnostic standard based on stdin and stdout. If a tool can read text and emit text, it can be a node in the pipe.
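The protocol imposes nothing beyond text in, text out. As an illustrative sketch (this trivial refinery is not part of the package), a complete CPP node in Python:

```python
import sys

def refine(text: str) -> str:
    """A trivial 'refinery': drop blank lines and trailing whitespace."""
    lines = (line.rstrip() for line in text.splitlines())
    return "\n".join(line for line in lines if line)

if __name__ == "__main__":
    # Read the whole upstream context from stdin, emit the refined text on stdout.
    sys.stdout.write(refine(sys.stdin.read()))
```

Saved as an executable script, such a node can sit anywhere in a pipe, e.g. `cat app.log | python refine.py`.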

2. The Universal Switchboard

A lightweight orchestrator that manages multi-node data streams (e.g., [Ingest] -> [Mask] -> [Rerank] -> [Distill]).
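Conceptually, such a stream behaves like a programmatic shell pipeline: each node's stdout feeds the next node's stdin. A minimal sketch of that mechanic (`run_stream` is a hypothetical helper, not the orchestrator's real API):

```python
import subprocess

def run_stream(text: str, nodes: list[list[str]]) -> str:
    """Pipe text through each node (given as a command line) in sequence."""
    data = text
    for cmd in nodes:
        result = subprocess.run(
            cmd, input=data, capture_output=True, text=True, check=True
        )
        data = result.stdout  # this node's output becomes the next node's input
    return data
```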

3. Subconscious Interceptors

Universal hooks that automatically apply your context pipes to any MCP tool call in IDEs like Cursor, VS Code, and Windsurf. For OpenCode, the AGENTS.md SOP mandate (pipe_read_file for all file reads) is the active strategy until transparent plugin interception is supported upstream.


🏗️ Getting Started

1. Installation

Option A: Quick Install (PyPI)

pip install mcp-context-pipe
pip install semantic-sift

Option B: Sovereign Pattern (Recommended for Studio of Two)

Clone both repos side-by-side. The context-pipe venv acts as the master environment holding both packages. See Section 0 of the Operator's Guide for the full sequence.

# 1. Clone both repos
git clone https://github.com/luismichio/context-pipe.git
git clone https://github.com/luismichio/semantic-sift.git

# 2. Master venv in context-pipe — holds both packages
cd context-pipe
python -m venv venv
# Windows:
.\venv\Scripts\activate
# macOS/Linux:
# source venv/bin/activate
pip install -e .
pip install -e ../semantic-sift  # semantic-sift-cli lands in context-pipe/venv/Scripts/ (Win) or venv/bin/ (Mac/Linux)

# 3. ML runtime venv in semantic-sift (Python 3.12 for torch/CUDA compatibility)
cd ../semantic-sift
python3.12 -m venv venv312
# Windows:
.\venv312\Scripts\activate
# macOS/Linux:
# source venv312/bin/activate
pip install mcp
pip install -e .[neural]         # torch, transformers, llmlingua

Note: The package name on PyPI is mcp-context-pipe but the installed module is context_pipe. The semantic-sift-cli binary is registered only in the venv where semantic-sift is pip-installed (step 2 above). Both pipes.json files must reference that absolute path.

2. Connect the MCP

CRITICAL: For exact configuration paths for Cursor, Gemini, OpenCode, VS Code, and Claude, reference the Master Configuration Matrix.

3. Connect a Refinery

Context-Pipe is the "Switchboard," but it needs a "Refinery" to distill data. Semantic-Sift is the flagship intelligence engine for this ecosystem. It uses heuristic sieves and neural models (BERT/ONNX) to incinerate noise (timestamps, boilerplate) while preserving 95% of the signal.

Note: In the Sovereign Pattern, semantic-sift is cross-installed into context-pipe/venv (step 2 above). Context-Pipe will also auto-discover a separately installed semantic-sift-cli across all known locations (system PATH, pipx, sibling venv directories) via pipe_onboard or pipe_verify.

4. Verify the Installation

After installing both packages, ask your AI assistant to verify the full stack:

"Run pipe_verify() to confirm the installation."

This will report the health of every component and automatically link semantic-sift-cli into pipes.json if it was found in a separate environment.

5. Configure your first Pipe

Edit pipes.json (see pipes.json.example) to define your high-fidelity context streams.
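The authoritative schema is pipes.json.example; the fragment below is only a plausible shape, built from the node forms shown under Advanced Node Types (the filter command and the absolute semantic-sift-cli path are placeholders for your own values):

```json
{
  "standard-distill": [
    { "cmd": "grep -v '^DEBUG'", "shell": true },
    { "cmd": "/abs/path/to/context-pipe/venv/bin/semantic-sift-cli" }
  ]
}
```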

6. Auto-Onboard

Once connected, ask your AI Assistant to configure your workspace:

"Run pipe_onboard(environment='Cursor') to configure this project."


📚 Documentation

Detailed documentation is available in the doc/ directory.


💻 Terminal Usage

Context-Pipe follows the Unix Philosophy. You can use it as a standalone utility or inside existing bash chains.

# Sift a log file through the 'standard-distill' pipe
cat app.log | context-pipe run standard-distill

# Process a document through a multi-node refinery
cat spec.pdf | context-pipe run full-refinery > distilled_spec.md

# Pre-distill code for manual copy-pasting
cat server.py | context-pipe run semantic-refinery | clip

🔗 Advanced Node Types

Context-Pipe supports more than just simple binaries. You can chain standard OS tools and expert mandates.

1. Bash Nodes (shell: true)

Execute arbitrary shell commands as part of your pipe.

{ "cmd": "grep 'ERROR'", "shell": true }

2. Skill Nodes

Apply an "Expert Lens" to the context by injecting specialized skill mandates.

{ "cmd": "context-pipe-skill", "args": ["security-auditor"] }

🔗 The Ecosystem (Studio of Two)

Context-Pipe is a foundational member of the Studio of Two infrastructure. It is designed to work in high-fidelity harmony with:

  • Semantic-Sift: The intelligent refinery for agentic context. Sift is the flagship distillation engine for Context-Pipe, providing the mathematical and neural sifting nodes used in our standard templates.

⚖️ Licensing

context-pipe is licensed under the Apache License 2.0. It is an "Open Source, Closed Contribution" project maintained by the Studio of Two to ensure architectural integrity.


Building High-Fidelity Infrastructure for the Intelligence Age.



Download files

Download the file for your platform.

Source Distribution

mcp_context_pipe-0.1.1.tar.gz (27.0 kB)

Built Distribution


mcp_context_pipe-0.1.1-py3-none-any.whl (27.9 kB)

File details

Details for the file mcp_context_pipe-0.1.1.tar.gz.

File metadata

  • Download URL: mcp_context_pipe-0.1.1.tar.gz
  • Upload date:
  • Size: 27.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for mcp_context_pipe-0.1.1.tar.gz:

  • SHA256: 298d8d193401c60f25e5cd38c71433387b0a98acdc0472dcfab46d8477c3ba73
  • MD5: 8ff069d298440f7c144fa47c9b859e11
  • BLAKE2b-256: e918a180128af47bc77f6262e658339253b10d7e2befd89539aec5989f4389bf


Provenance

The following attestation bundles were made for mcp_context_pipe-0.1.1.tar.gz:

Publisher: release.yml on luismichio/context-pipe

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mcp_context_pipe-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for mcp_context_pipe-0.1.1-py3-none-any.whl:

  • SHA256: d58f5ae7011db1b67bc4f177c9bab8a40e1e5af34aef84fd1f7e2bbe9b087929
  • MD5: 87e789b7ee6673b422f0891e991eec3f
  • BLAKE2b-256: 278fac87accf435830a4dfd86bed7a186cc64786e567241f11b8a6d077ec7b64


Provenance

The following attestation bundles were made for mcp_context_pipe-0.1.1-py3-none-any.whl:

Publisher: release.yml on luismichio/context-pipe

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
