AI-native process classification MCP Server
Project description
O'Process
AI-native process classification MCP Server. Query 2,436 processes and 3,284 KPIs from APQC PCF 8.0 + ITIL V5 + SCOR DS 14.0 + AI-era extensions.
Version: 0.4.0 | MCP SDK: Anthropic official mcp 1.26.0 | Protocol: 2025-11-25 | Coverage: 85%+
What It Does
O'Process gives AI assistants (Claude, GPT, etc.) real-time access to enterprise process knowledge. Connect it as an MCP Server, then ask natural language questions — the AI will call the right tools automatically.
Core capabilities:
- Process Search — "采购流程有哪些?" → returns matching process nodes with hierarchy, description, and confidence score
- Process Tree Navigation — browse the 5-level taxonomy (L1 categories → L5 activities)
- KPI Recommendations — get metrics for any process node (name, unit, formula, direction)
- Role-Process Mapping — "HRBP manages which processes?" → ranked list with confidence scores
- Process Comparison — side-by-side diff of 2+ process nodes across all attributes
- Responsibility Document — generate complete job descriptions with provenance appendix
Why It Matters
| Without O'Process | With O'Process |
|---|---|
| Manually search APQC PCF Excel (2017 rows) | Natural language query, instant results |
| Guess which KPIs apply to a process | Structured KPI suggestions from 3,284 metrics |
| Write job descriptions from scratch | Auto-generated with process-backed provenance |
| Cross-reference APQC + ITIL + SCOR manually | Unified 2,436-node taxonomy, one query |
Use Cases
Management Consulting — Process diagnostics. A manufacturing company's delivery cycle is 30% slower than competitors. Use search_process to locate SCOR Plan/Deliver/Make nodes, then get_kpi_suggestions to build a measurement framework.
HR Digital Transformation — Role-process mapping. CHRO needs to know what processes HR actually owns. Use get_process_tree on node 7.0 (Human Capital) to get the full L1→L4 hierarchy, then map_role_to_processes to map "HRBP" to standard processes.
Legal Due Diligence — Compliance audit. Cross-border M&A requires checking 12+ regulatory domains. Use search_process to locate relevant PCF nodes (corporate governance, tax, labor, environmental), then compare_processes to identify coverage gaps.
Internal Audit — KPI system design. Use get_kpi_suggestions for each process node, review coverage across efficiency/quality/cost/timeliness dimensions, identify missing metrics.
Quick Start
# Install
uv sync
# Run MCP Server (stdio — default)
uv run python -m oprocess.server
# Run with SSE transport
uv run python -m oprocess.server --transport sse --port 8000
# Run with streamable-http transport
uv run python -m oprocess.server --transport streamable-http --port 8000
Claude Desktop Configuration
Add to claude_desktop_config.json:
{
"mcpServers": {
"oprocess": {
"command": "uv",
"args": ["run", "python", "-m", "oprocess.server"],
"cwd": "/path/to/O-Process"
}
}
}
Tools
8 MCP tools with full input validation, structured output, and ToolAnnotations:
| Tool | Description | Key Parameters |
|---|---|---|
search_process |
Semantic search for process nodes | query (1-500 chars), lang (zh/en), limit (1-50), level (1-5) |
get_process_tree |
Get process subtree with children | process_id (e.g. "1.0"), max_depth (1-5) |
get_kpi_suggestions |
Get KPIs for a process node | process_id |
compare_processes |
Compare 2+ process nodes side-by-side | process_ids (comma-separated, 2+) |
get_responsibilities |
Generate role responsibilities | process_id, lang, output_format (json/markdown) |
map_role_to_processes |
Map job role to relevant processes | role_description (1-500 chars), lang, limit, industry |
export_responsibility_doc |
Export full responsibility document | process_ids (1+), lang, role_name |
health_check |
Health check — server status and data counts | (none) |
All tools return structured content (structuredContent + text) with result, provenance_chain, session_id, and response_ms. Each tool has outputSchema auto-generated from Pydantic models.
Invalid inputs raise ToolError (Tool Execution Error for LLM self-correction). All tools are annotated with readOnlyHint, idempotentHint, destructiveHint, and openWorldHint.
Prompts
3 guided prompt templates for common workflows:
| Prompt | Description | Parameters |
|---|---|---|
analyze_process |
Step-by-step process analysis workflow | process_id, lang |
generate_job_description |
Role responsibility document generation | process_ids, role_name, lang |
kpi_review |
KPI review and gap analysis workflow | process_id, lang |
Resources
6 MCP resources for direct data access:
| URI | Title | Description |
|---|---|---|
oprocess://process/{id} |
Process Node | Complete process node data |
oprocess://category/list |
Category List | All L1 process categories |
oprocess://role/{role_name} |
Role-Process Mapping | Process mappings for a role |
oprocess://audit/session/{id} |
Audit Session Log | Audit log for a session |
oprocess://schema/sqlite |
SQLite Schema | Public table schema (processes, kpis) |
oprocess://stats |
Framework Statistics | Process/KPI counts and version |
Authentication
For HTTP transports (SSE, streamable-http), authentication is handled at the reverse-proxy layer (e.g. Caddy with forward_auth or bearer_token directive). See deploy/README.md for Caddy configuration.
stdio mode requires no authentication (local process communication).
Environment Variables
| Variable | Required | Description |
|---|---|---|
GOOGLE_API_KEY |
No | Enables semantic vector search (gemini-embedding-001). Without it, search falls back to SQL LIKE matching — all features still work. |
OPROCESS_API_KEY |
No | Bearer token for reverse-proxy auth layer. |
OPROCESS_ALLOWED_ORIGINS |
No | Comma-separated allowed origins for CORS. |
LOG_LEVEL |
No | Logging level (default: INFO) |
No API key is required to run the server. All 8 tools work out of the box. Setting
GOOGLE_API_KEYupgradessearch_processandmap_role_to_processesfrom text matching to semantic vector search.
Logging
Structured JSON logging (no extra dependencies):
# Default level: INFO (all tool calls logged)
export LOG_LEVEL=DEBUG # DEBUG, INFO, WARNING, ERROR
# Output format (JSON):
# {"ts":"2026-03-16 12:00:00","level":"INFO","logger":"oprocess","msg":"tool.execute","tool":"search_process","session_id":"...","ms":4}
Configuration
Server behavior can be tuned via [tool.oprocess] in pyproject.toml:
| Key | Default | Description |
|---|---|---|
boundary_threshold |
0.45 |
Cosine distance threshold for BoundaryResponse |
audit_log_enabled |
true |
Enable/disable SessionAuditLog |
default_language |
"zh" |
Default language (zh/en) |
rate_limit_max_calls |
60 |
Max tool calls per window |
rate_limit_window_seconds |
60 |
Rate limit window duration (seconds) |
Rate limiting is enforced via thread-safe RateLimiter. Exceeding the limit returns MCP error code -32000.
Governance-Lite
Transparent governance layer (non-blocking):
- SessionAuditLog — Append-only invocation log per session (failure-tolerant with escalation)
- BoundaryResponse — Structured fallback when semantic confidence is low (threshold: 0.45)
- ProvenanceChain — Derivation trail attached to every tool response
- Prompt Injection Mitigation — Description fields sanitized with
[DATA_BEGIN]/[DATA_END]markers
Data Sources
| Source | Entries | License |
|---|---|---|
| APQC PCF 8.0 | 2,017 processes | Royalty-free with attribution |
| ITIL V5 | 145 nodes | Practice names only (industry terms) |
| SCOR DS 14.0 | 175 nodes | Open-access standard |
| AI-era extensions | 99 nodes | Original (MIT) |
| Total | 2,436 processes | |
| KPI metrics | 3,284 | From APQC PCF 8.0 Metrics |
Bilingual: Chinese (zh) + English (en).
Third-Party Attribution
APQC Process Classification Framework® (PCF) is an open standard developed by APQC, a nonprofit that promotes benchmarking and best practices worldwide. Used under APQC's royalty-free license for derivative works. To download the original PCF, visit apqc.org/pcf.
ITIL® is a registered trademark of PeopleCert group. This project references ITIL V5 practice names as industry-standard terminology. All descriptions are independently written and are not reproduced from ITIL publications.
SCOR® (Supply Chain Operations Reference) is a product of ASCM. This project references SCOR DS 14.0 process names as open-access industry-standard terminology. All descriptions are independently written.
Development
# Install dependencies
uv sync
# Lint
ruff check .
# Test (217 tests, 85%+ coverage)
pytest
# Full check (lint + test + benchmark)
ruff check . && pytest && pytest --benchmark-only
Project Structure
src/oprocess/
├── server.py # MCP entry point (stdio/SSE/HTTP)
├── gateway.py # ToolGatewayInterface + PassthroughGateway
├── config.py # pyproject.toml config loader
├── validators.py # Input validation + sanitization
├── prompts.py # 3 MCP prompt templates
├── tools/
│ ├── registry.py # 6 tool registrations
│ ├── search.py # search_process + map_role_to_processes
│ ├── resources.py # 6 MCP resources
│ ├── _models.py # Pydantic response models
│ ├── export.py # Responsibility document builder
│ ├── helpers.py # Provenance + comparison utilities
│ ├── serialization.py # ToolResponse → ToolEnvelope
│ └── rate_limit.py # Thread-safe rate limiter
├── governance/
│ ├── audit.py # SessionAuditLog
│ ├── boundary.py # BoundaryResponse
│ └── provenance.py # ProvenanceChain
└── db/
├── connection.py # SQLite + sqlite-vec connection
├── queries.py # All SQL queries
├── embedder.py # Gemini embedding (with timeout + retry)
└── vector_search.py # sqlite-vec vector search
Tech Stack
- Runtime: Python 3.10+
- MCP SDK: Anthropic official
mcp1.26.0 (mcp.server.fastmcp) - Protocol: MCP 2025-11-25 (structured output, ToolAnnotations, title)
- Validation: Pydantic 2.x (
Annotated[..., Field(...)]) - Database: SQLite + sqlite-vec (optional vector search)
- Embeddings: gemini-embedding-001 (768-dim, 5s timeout, 2-retry)
- Packaging: uv + hatchling
Deployment
See deploy/README.md for production deployment on Alibaba Cloud with Caddy reverse proxy.
Live endpoint: http://8.138.46.17/mcp (streamable-http)
License
MIT — applies to all source code and AI-era original content.
Third-party framework data (APQC PCF, ITIL, SCOR) is used under their respective licenses. See Third-Party Attribution for details.
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
Filter files by name, interpreter, ABI, and platform.
If you're not sure about the file name format, learn more about wheel file names.
Copy a direct link to the current filters
File details
Details for the file oprocess-0.4.0.tar.gz.
File metadata
- Download URL: oprocess-0.4.0.tar.gz
- Upload date:
- Size: 89.7 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.27 {"installer":{"name":"uv","version":"0.9.27","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"22.04","id":"jammy","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
7aa07589b1029d8051b40c3d6a4cd9f255d9e627106e4b638f1d23db95c5181e
|
|
| MD5 |
5807fe3966bf8af2a7e907ca032c685d
|
|
| BLAKE2b-256 |
0a0878e739e44318842acc1e7d23cfa2e3e24dfda0cab5c276704be400d7ec3a
|
File details
Details for the file oprocess-0.4.0-py3-none-any.whl.
File metadata
- Download URL: oprocess-0.4.0-py3-none-any.whl
- Upload date:
- Size: 41.1 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.9.27 {"installer":{"name":"uv","version":"0.9.27","subcommand":["publish"]},"python":null,"implementation":{"name":null,"version":null},"distro":{"name":"Ubuntu","version":"22.04","id":"jammy","libc":null},"system":{"name":null,"release":null},"cpu":null,"openssl_version":null,"setuptools_version":null,"rustc_version":null,"ci":null}
File hashes
| Algorithm | Hash digest | |
|---|---|---|
| SHA256 |
ad9f2465504982860c196a31f27512350f3d270264e23a9d1cabdeb4d96e50c3
|
|
| MD5 |
d3d6224e03854da67c8ee0493a2b7237
|
|
| BLAKE2b-256 |
70fb2930111e37e4105d88ff3976254ffe7e134970bb44b83453b1e886e3ef04
|