
Universal switchboard for the Context-Pipe Protocol (CPP)


⛓️ Context-Pipe

The Universal Standard for Context Engineering.


context-pipe is a high-performance orchestration layer designed to bring the Unix Philosophy to the AI context window. It allows you to connect AI tools (Spokes) into a series of Streams, ensuring that data is refined, distilled, and noise-free before it ever reaches the LLM.


🚀 The Vision

In the "Studio of Two" philosophy, we build Systems, not Patches. context-pipe is the system that manages the flow of context, allowing you to chain specialized tools (Refineries) like semantic-sift into your agentic workflows with zero token overhead and millisecond latency.


🛠️ Core Components

1. The Context-Pipe Protocol (CPP)

A language-agnostic standard based on stdin and stdout. If a tool can read text and emit text, it can be a node in the pipe.
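Because the protocol is just text-in, text-out, a node can be sketched in a few lines of any language. The Python node below is purely illustrative (the `refine` function and its blank-line filter are invented here; any CPP framing details beyond plain stdin/stdout are not specified in this README):

```python
import sys


def refine(text: str) -> str:
    """A trivial 'refinery' step: drop blank lines and trailing whitespace."""
    lines = (line.rstrip() for line in text.splitlines())
    return "\n".join(line for line in lines if line)


if __name__ == "__main__":
    # A CPP node: read the whole context from stdin, emit refined text on stdout.
    sys.stdout.write(refine(sys.stdin.read()))
```

Saved as an executable script, a node like this can sit anywhere in a pipe, exactly like any Unix filter.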

2. The Universal Switchboard

A lightweight orchestrator that manages multi-node data streams (e.g., [Ingest] -> [Mask] -> [Rerank] -> [Distill]).
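Conceptually, running a stream is function composition over text: each node's stdout becomes the next node's stdin. A minimal sketch (the node names follow the example above; their toy implementations are invented here, and the real switchboard runs them as subprocesses rather than in-process functions):

```python
from typing import Callable, Iterable

Node = Callable[[str], str]


def run_stream(nodes: Iterable[Node], payload: str) -> str:
    """Pass the payload through each node in order, stdin-to-stdout style."""
    for node in nodes:
        payload = node(payload)
    return payload


# Toy stand-ins for [Ingest] -> [Mask] -> [Distill]
ingest = lambda s: s.strip()
mask = lambda s: s.replace("secret", "[MASKED]")
distill = lambda s: "\n".join(dict.fromkeys(s.splitlines()))  # dedupe lines, keep order

result = run_stream([ingest, mask, distill], "  key=secret\nkey=secret\n")
```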

3. Subconscious Interceptors

Universal hooks that automatically apply your context pipes to any MCP tool call in IDEs like Cursor, VS Code, and Windsurf.


🏗️ Getting Started

1. Installation

Clone the repository and install the orchestrator:

git clone https://github.com/luismichio/context-pipe.git
cd context-pipe
# Dedicated environment (recommended)
python -m venv venv
# Windows:
.\venv\Scripts\activate
# macOS/Linux:
source venv/bin/activate
pip install .

🐍 Python Environment Guidance

Choosing the right Python path for your MCP configuration is critical for stability:

| Setup Type | Path Example | Pros | Cons |
| --- | --- | --- | --- |
| Dedicated venv | `.../context-pipe/venv/Scripts/python.exe` | Isolated dependencies; no version conflicts with other tools. | Slightly more disk space. |
| Global Python | `C:/Users/User/AppData/Local/.../python.exe` | Shared libraries; fast setup. | High risk of version conflicts. |

Recommendation: Always use the Dedicated Venv path in your mcp_config.json to ensure the orchestrator is fast and stable.
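As an illustration, an entry in `mcp_config.json` might look like the sketch below. The `mcpServers`/`command`/`args` shape follows the common MCP client configuration format; the server name, the Windows-style path, and the `context_pipe` module name are assumptions here, so defer to the Master Configuration Matrix for your client's exact layout:

```json
{
  "mcpServers": {
    "context-pipe": {
      "command": "C:/path/to/context-pipe/venv/Scripts/python.exe",
      "args": ["-m", "context_pipe"]
    }
  }
}
```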

For development tools (pytest, ruff, mypy):

pip install ".[dev]"

2. Connect the MCP

CRITICAL: For exact configuration paths for Cursor, Gemini, OpenCode, VS Code, and Claude, reference the Master Configuration Matrix.

3. Connect a Refinery

Context-Pipe is the "Switchboard," but it needs a "Refinery" to distill data. Semantic-Sift is the flagship intelligence engine for this ecosystem. It uses heuristic sieves and neural models (BERT/ONNX) to incinerate noise (timestamps, boilerplate) while preserving 95% of the signal.

# Clone the Sift repository to gain access to the Rust sidecar and models
git clone https://github.com/luismichio/semantic-sift.git
cd semantic-sift
pip install ".[neural,multi-modal]"

4. Configure your first Pipe

Edit pipes.json (see pipes.json.example) to define your high-fidelity context streams.
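For orientation, a hypothetical `pipes.json` sketch, reusing the node shapes shown under Advanced Node Types. The top-level "pipe name → node list" layout is an assumption on our part; `pipes.json.example` in the repository is authoritative:

```json
{
  "standard-distill": [
    { "cmd": "grep 'ERROR'", "shell": true },
    { "cmd": "context-pipe-skill", "args": ["security-auditor"] }
  ]
}
```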

5. Auto-Onboard

Once connected, ask your AI Assistant to configure your workspace:

"Run pipe_onboard(environment='Cursor') to configure this project."


📚 Documentation

Detailed documentation is available in the doc/ directory.


💻 Terminal Usage

Context-Pipe follows the Unix Philosophy. You can use it as a standalone utility or inside existing bash chains.

# Sift a log file through the 'standard-distill' pipe
cat app.log | context-pipe run standard-distill

# Process a document through a multi-node refinery
cat spec.pdf | context-pipe run full-refinery > distilled_spec.md

# Pre-distill code for manual copy-pasting (clip is Windows; use pbcopy on macOS or xclip on Linux)
cat server.py | context-pipe run semantic-refinery | clip

🔗 Advanced Node Types

Context-Pipe supports more than just simple binaries. You can chain standard OS tools and expert mandates.

1. Bash Nodes (shell: true)

Execute arbitrary shell commands as part of your pipe.

{ "cmd": "grep 'ERROR'", "shell": true }

2. Skill Nodes

Apply an "Expert Lens" to the context by injecting specialized skill mandates.

{ "cmd": "context-pipe-skill", "args": ["security-auditor"] }

🔗 The Ecosystem (Studio of Two)

Context-Pipe is a foundational member of the Studio of Two infrastructure. It is designed to work in high-fidelity harmony with:

  • Semantic-Sift: The intelligent refinery for agentic context. Sift is the flagship distillation engine for Context-Pipe, providing the mathematical and neural sifting nodes used in our standard templates.

⚖️ Licensing

context-pipe is licensed under the Apache License 2.0. It is an "Open Source, Closed Contribution" project maintained by the Studio of Two to ensure architectural integrity.


Building High-Fidelity Infrastructure for the Intelligence Age.



Download files

  • Source distribution: mcp_context_pipe-0.1.0.tar.gz (23.1 kB)
  • Built distribution: mcp_context_pipe-0.1.0-py3-none-any.whl (24.0 kB, Python 3)

File details: mcp_context_pipe-0.1.0.tar.gz

  • Size: 23.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing: Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

Hashes:

  • SHA256: e131eeec5dd821c9863fc1e8d0638e2d3714af35112439182e1ef341d8037334
  • MD5: 16dbd1f78150c1fe1a3731692e5443ac
  • BLAKE2b-256: d49de70220be570fffa06429e1549ee97e7ba8fbd6386eb3053fa0e1cba8a789

Provenance: Published by release.yml on luismichio/context-pipe. Attestation values reflect the state when the release was signed and may no longer be current.

File details: mcp_context_pipe-0.1.0-py3-none-any.whl

Hashes:

  • SHA256: 9e93eefa9fc53ac5b92ab7bf3199650caf74ce816b1651e9380267d413de6558
  • MD5: 55edb4f36372bd658c45e6bad939519f
  • BLAKE2b-256: e20389292ecf9ab379487bc8e0938c1b375282e4006c167b9b2a0dcfebef0ef5

Provenance: Published by release.yml on luismichio/context-pipe. Attestation values reflect the state when the release was signed and may no longer be current.
