# Karate Graph (MCP-Powered)
A Model Context Protocol (MCP) tool for AI agents to analyze Karate Framework projects, build dependency graphs, and generate interactive reports.
## Why use this with AI?
- Impact analysis with dependency paths and source lines
- Multi-project graph exploration
- Search APIs, workflows, pages, Java/JS usages
- Failure hotspot and flaky-risk prioritization
- Reusable helper discovery before writing new code
## Quick Start
- Install:

  ```bash
  pip install karate-graph
  ```

  Install from TestPyPI (pre-release validation):

  ```bash
  pip install --index-url https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ karate-graph==0.1.0
  ```

  Verify the installation:

  ```bash
  karate-graph-mcp --help
  python -c "import karate_graph_analyzer; print('ok')"
  ```

  For local development:

  ```bash
  git clone <repository-url>
  cd karate-graph
  pip install -r requirements-dev.txt
  ```
- Configure your MCP client:

  ```json
  {
    "mcpServers": {
      "karate-graph": {
        "command": "karate-graph-mcp",
        "args": [],
        "env": {
          "PYTHONPATH": "C:/path/to/repo/src",
          "PYTHONIOENCODING": "utf-8"
        }
      }
    }
  }
  ```
- Ask your AI to:
  - Register and analyze a project
  - Show the impact if a component changes
  - Find reusable helpers before creating new ones
## Available AI Tools

`register_project`, `analyze_project`, `bulk_analyze`, `impact_analysis`, `search_api`, `search_workflow`, `search_test_case`, `search_java_usage`, `search_js_usage`, `search_error_pattern`, `search_reusable_function`, `change_impact_preview`, `test_selection_suggestion`, `feature_intent_index`, `variable_data_flow_trace`, `assertion_map`, `call_read_deep_context`, `ai_feature_context_pack`, `feature_behavior_map`, `scenario_similarity_map`, `feature_reuse_advisor`, `db_query_index`, `search_db_usage`, `db_data_flow_trace`, `db_assertion_map`, `db_impact_preview`, `top_hotspots`, `prioritize_fix_queue`, `flaky_risk`
## Reuse And Smart Retest
### 1) Reuse search

`search_reusable_function(project_name, query, language, limit)` now returns `tags`, `aliases`, `usage_examples`, and `stability_score`, which helps AI pick existing helpers with greater confidence.
### 2) Change impact preview

`change_impact_preview(project_name, changed_paths, limit)` maps changed files/components to impacted test cases via dependency paths.
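As a rough illustration, mapping changed paths to impacted tests can be viewed as a traversal over reverse dependency edges. The graph, file names, and helper below are hypothetical, not the tool's internals:

```python
# Hypothetical sketch: walk reverse-dependency edges from changed files
# to every test case reachable from them.
from collections import deque

# component -> things that depend on it (illustrative edges only)
DEPENDENTS = {
    "helpers/auth.feature": ["features/login.feature", "features/orders.feature"],
    "features/login.feature": [],
    "features/orders.feature": [],
}

def impacted_tests(changed_paths):
    """Breadth-first collection of everything downstream of the change."""
    seen, queue = set(), deque(changed_paths)
    while queue:
        node = queue.popleft()
        for dep in DEPENDENTS.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return sorted(seen)

print(impacted_tests(["helpers/auth.feature"]))
# ['features/login.feature', 'features/orders.feature']
```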
### 3) Test selection suggestion

`test_selection_suggestion(project_name, changed_paths, limit)` suggests a compact, high-signal rerun set.

Current priority strategy:

```
priority = trigger_count * 10 - min_depth
```
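The strategy above can be sketched in a few lines; the test names and counts here are illustrative, not real tool output:

```python
# Sketch of the rerun-priority heuristic: many triggering changes and a
# short dependency path push a test toward the front of the rerun queue.

def rerun_priority(trigger_count: int, min_depth: int) -> int:
    """Higher score = rerun sooner."""
    return trigger_count * 10 - min_depth

impacted = [
    {"test": "login.feature", "trigger_count": 3, "min_depth": 1},   # score 29
    {"test": "report.feature", "trigger_count": 1, "min_depth": 4},  # score 6
]
ranked = sorted(
    impacted,
    key=lambda t: rerun_priority(t["trigger_count"], t["min_depth"]),
    reverse=True,
)
print([t["test"] for t in ranked])  # login.feature ranks first
```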
## Feature Understanding For AI

These tools help AI understand Karate `.feature` files before editing or debugging:
- `feature_intent_index(project_name, query, limit)`: summarizes scenario intent, step roles, API signals, data files, assertions, and call/read usage
- `variable_data_flow_trace(project_name, feature_path, scenario_tag, scenario_name, node_id, limit)`: traces `def`/`set` variables from source expressions to usage lines
- `assertion_map(project_name, query, limit)`: indexes `status`, `match`, and `assert` checks across feature files
- `call_read_deep_context(project_name, feature_path, scenario_tag, scenario_name, node_id, max_depth, limit)`: expands nested `call read(...)` chains, including target feature/scenario context
- `ai_feature_context_pack(project_name, feature_path, scenario_tag, scenario_name, node_id, max_call_depth, limit)`: returns an AI-ready pack with intent, variable flow, assertions, call/read chain, and graph context
- `feature_behavior_map(project_name, feature_path, scenario_tag, scenario_name, node_id, limit)`: groups scenario behavior into preconditions, actions, and expectations, plus data inputs and status expectations
- `scenario_similarity_map(project_name, query, limit, top_k)`: finds similar scenarios by intent-keyword overlap to improve reuse and AI suggestion quality
- `feature_reuse_advisor(project_name, min_group_size, min_flow_length, limit, include_low_signal)`: finds duplicate steps and repeated flows, indexes their locations, and returns AI-safe refactor plans
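For intuition, intent-keyword overlap can be modeled as Jaccard similarity over keyword sets. This is a hedged sketch of the idea, not `scenario_similarity_map`'s actual scoring:

```python
# Illustrative Jaccard overlap between two scenarios' intent keywords.

def keyword_overlap(a: set, b: set) -> float:
    """Shared keywords divided by all distinct keywords (0.0 to 1.0)."""
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)

login_v1 = {"login", "token", "status", "200"}
login_v2 = {"login", "token", "refresh"}
print(round(keyword_overlap(login_v1, login_v2), 2))  # 2 shared / 5 total = 0.4
```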
## DB Understanding For AI
These tools help AI understand query usage, variable flow, and DB-related impact:
- `db_query_index(project_name, query, limit, include_components, link_status)`: indexes DB query nodes and DB executors/components with operation/table/database/host/dialect/provider/risk/usage/link status
- `search_db_usage(project_name, query, limit, link_status)`: searches DB usage by table, operation, host, file path, or query keywords
- `db_data_flow_trace(project_name, feature_path, scenario_tag, scenario_name, node_id, limit)`: traces DB-related variables, DB call steps, and DB assertions inside selected scenarios
- `db_assertion_map(project_name, query, limit)`: indexes DB-related assertion steps and links them to DB variables/query signatures
- `db_impact_preview(project_name, changed_entities, limit)`: previews impacted test cases from changed DB entities (tables/schemas/hosts/DB feature paths)
Dialect and DB-type detection is included in DB outputs:
- SQL dialects: PostgreSQL, MySQL, MariaDB, Oracle, SQL Server, SQLite, DB2, H2, Redshift, Snowflake, ClickHouse, and generic SQL
- NoSQL/cache/search/graph stores: MongoDB, Redis, DynamoDB, Cassandra/CQL, Elasticsearch/OpenSearch, Neo4j/Cypher
- Returned fields include `db_type`, `dialect`, `provider`, `dialect_confidence`, `dialect_signals`, `entity_type`, and `entity_name`.
- Returned DB index rows also include `link_status`: `linked`, `orphan`, `component`, or `demo`.
- Use `link_status="default"` to focus on impact-first rows (linked + orphan) while keeping component/demo context searchable when needed.
- Detection uses Strategy + Registry + Value Object so new DB providers can be added as isolated strategies.
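A minimal sketch of how a Strategy + Registry + Value Object design for dialect detection might look. All class names and heuristics below are hypothetical, chosen only to illustrate the pattern:

```python
# Hypothetical Strategy + Registry + Value Object sketch for DB dialect
# detection; the real detector's names and signals may differ.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class DialectResult:          # immutable value object
    db_type: str
    dialect: str
    confidence: float

class DialectStrategy:        # strategy interface
    def detect(self, query: str) -> Optional[DialectResult]:
        raise NotImplementedError

class PostgresStrategy(DialectStrategy):
    def detect(self, query: str) -> Optional[DialectResult]:
        if "::" in query or "RETURNING" in query.upper():
            return DialectResult("sql", "postgresql", 0.8)
        return None

class MongoStrategy(DialectStrategy):
    def detect(self, query: str) -> Optional[DialectResult]:
        if "db." in query and ".find(" in query:
            return DialectResult("nosql", "mongodb", 0.9)
        return None

# Registry: adding a provider means appending one isolated strategy.
REGISTRY = [PostgresStrategy(), MongoStrategy()]

def detect_dialect(query: str) -> DialectResult:
    for strategy in REGISTRY:          # first confident match wins
        result = strategy.detect(query)
        if result:
            return result
    return DialectResult("sql", "generic", 0.1)

print(detect_dialect("db.users.find({})").dialect)  # mongodb
```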
## Visual Reports

Interactive HTML graph output is generated under each project's `output/` folder.
- Test cases are shown as `@TEST-ID - Scenario name` when Jira/test-case tags exist.
- Dashboard search supports `TEST-ID`, `@TEST-ID`, scenario name, component name, and feature path.
## Publishing

Build and validate locally first:

```bash
python -m pip install build twine
python -m build
python -m twine check dist/*
```

Publish to TestPyPI before PyPI:

```bash
python -m twine upload --repository testpypi dist/*
python -m twine upload dist/*
```

Use an API token for upload. The package ships report assets (`style.css`, `script.js`) via package data.
## Project Structure

```
karate-graph/
|-- src/karate_graph_analyzer/
|   |-- mcp_server.py
|   |-- mcp_interface/
|   |-- parser/
|   |-- graph/
|   `-- visualization/
|-- output/
`-- tests/
```
## License

MIT
## File details

### karate_graph-0.1.1.tar.gz (Source Distribution)

- Size: 165.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | `7a75e1ab0721cbf15487872f16c0962234620987645bb0594a71f78088664ea6` |
| MD5 | `378345f0247ffcf613c7ffab000a266b` |
| BLAKE2b-256 | `ade9d2e8c42b24e8160bedd3385fcd7c4cf750cf68a9dbdcf724812d3047e998` |
### karate_graph-0.1.1-py3-none-any.whl (Built Distribution)

- Size: 195.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.14.2

| Algorithm | Hash digest |
|---|---|
| SHA256 | `ea750dbf0975116b66c2285975f39876993ee45a3e204b22f0a0c4d291719895` |
| MD5 | `d26873555688abaf57186a7f3ea483a1` |
| BLAKE2b-256 | `5ba3d372308c429c745047e17a85a8ccbaafb0e004f4377159420abee594c179` |