Agentic schema analyzer for ArangoDB: conceptual model + conceptual-to-physical mapping for transpilers.
Project description
arangodb-schema-analyzer (v0.1)
Standalone Python library that analyzes an ArangoDB database's physical schema and produces:
- a conceptual schema (entities, relationships, properties)
- a conceptual→physical mapping suitable for transpilers (Cypher, SPARQL, future)
- metadata (confidence, timestamp, analyzed collection counts, detected patterns)
Install
From source (this repo):
python -m pip install -e .
Optional LLM provider extras:
python -m pip install -e ".[openai]"
python -m pip install -e ".[anthropic]"
OpenRouter is also supported and requires no extra SDK (uses stdlib urllib).
MCP (Model Context Protocol) — optional stdio server wrapping the v1 JSON tool contract:
python -m pip install -e ".[mcp]"
arangodb-schema-analyzer-mcp
If you don't install a provider SDK (or you don't provide an API key), analysis degrades gracefully to deterministic baseline inference.
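The fallback can be pictured with a generic pattern (illustrative only — the function and field names below are invented for this sketch, not the library's internals):

```python
# Illustrative fallback pattern: prefer the LLM path, fall back to a
# deterministic baseline when no provider or API key is configured.
def analyze(collections, llm=None):
    if llm is None:
        # Deterministic heuristics only; flag the result for human review.
        return {"source": "baseline", "review_required": True,
                "entities": sorted(collections)}
    return {"source": "llm", "review_required": False,
            "entities": llm(collections)}

result = analyze(["customers", "orders"])
```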
Usage
from arango import ArangoClient

from schema_analyzer import AgenticSchemaAnalyzer

client = ArangoClient(hosts="http://localhost:8529")
db = client.db("mydb", username="root", password="openSesame")

analyzer = AgenticSchemaAnalyzer(
    llm_provider="openai",  # or "anthropic" or "openrouter"
    api_key=None,           # e.g. os.environ["OPENAI_API_KEY"]
    model="gpt-4o-mini",
    cache={"type": "filesystem", "directory": ".schema-analyzer-cache"},
)

analysis = analyzer.analyze_physical_schema(
    db,
    timeout_ms=60_000,
    sample_limit_per_collection=5,
)

print(analysis.metadata.confidence)
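The printed confidence pairs with the review gate described under Configuration. A minimal sketch of that check (0.6 is the documented DEFAULT_REVIEW_THRESHOLD; the helper name is ours, not part of the API):

```python
# Flag an analysis for human review when confidence falls below the threshold.
def needs_review(confidence: float, threshold: float = 0.6) -> bool:
    return confidence < threshold

print(needs_review(0.42))  # low-confidence result -> review
```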
Tool usage (CLI)
This project can be called as a non-interactive tool (stdin JSON → stdout JSON) using the v1 contract under docs/tool-contract/v1/.
Install (editable):
python -m pip install -e .
Example (analyze) using the provided request example:
cat docs/tool-contract/v1/examples/request.analyze.json | arangodb-schema-analyzer --pretty
CLI options
arangodb-schema-analyzer [--request FILE] [--out FILE] [--pretty] [-v]
- --request FILE — path to request JSON (default: read from stdin)
- --out FILE — write response JSON to a file (default: stdout)
- --pretty — pretty-print JSON output
- -v — enable verbose logging
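To use --request/--out instead of a pipe, write the request JSON to a file first. A sketch (the envelope below is a placeholder — the authoritative shape is the v1 contract under docs/tool-contract/v1/):

```python
import json

# Placeholder request body; copy the real shape from
# docs/tool-contract/v1/examples/request.analyze.json.
request = {"op": "analyze"}

with open("request.analyze.json", "w") as f:
    json.dump(request, f)

# Then: arangodb-schema-analyzer --request request.analyze.json --out response.json --pretty
```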
Evaluation CLI
Run analysis quality benchmarks against domain packs:
arangodb-schema-analyzer eval \
  --provider openai \
  --model gpt-4o-mini \
  --report eval_report.json \
  --baseline eval_baseline.json
Options: --url, --user, --password, --database, --domains, --sample-limit, --timeout-ms, --scale, --no-cleanup.
Domains included: healthcare, financial_fraud_detection, insurance, intelligence, network_asset_management.
Public API
Exports:
- AgenticSchemaAnalyzer — main analyzer class
- ConceptualSchema — conceptual schema dataclass
- PhysicalMapping — physical mapping dataclass with AQL helpers
- generate_schema_docs(analysis) — Markdown documentation generator
- export_mapping(analysis, target) — transpiler export (v0.1: cypher)
- export_conceptual_model_as_owl_turtle(analysis) — OWL Turtle export
- register_provider(name, ...) — register custom LLM providers
- list_providers() — list registered LLM provider names
Configuration
Tunable defaults live in schema_analyzer/defaults.py. Key parameters:
| Parameter | Default | Description |
|---|---|---|
| MAX_REPAIR_ATTEMPTS | 2 | LLM repair loop iterations |
| LLM_TEMPERATURE | 0.0 | Sampling temperature |
| DEFAULT_TIMEOUT_MS | 60000 | Analysis timeout (ms) |
| DEFAULT_REVIEW_THRESHOLD | 0.6 | Confidence threshold for review_required |
| DEFAULT_CACHE_TTL_SECONDS | 86400 | Cache TTL (seconds) |
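As a quick reference, the table corresponds to constants along these lines (a sketch based solely on the table above; the authoritative values live in schema_analyzer/defaults.py):

```python
# Mirrors the documented defaults in schema_analyzer/defaults.py.
MAX_REPAIR_ATTEMPTS = 2             # LLM repair loop iterations
LLM_TEMPERATURE = 0.0               # sampling temperature
DEFAULT_TIMEOUT_MS = 60_000         # analysis timeout (ms)
DEFAULT_REVIEW_THRESHOLD = 0.6      # confidence threshold for review_required
DEFAULT_CACHE_TTL_SECONDS = 86_400  # cache TTL (seconds)
```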
Notes
- Secrets: API keys are read from config/env; never persisted by this library.
- AQL fragments: helper methods return AQL text + bind variables; collection names are passed via bind parameters.
- Graceful degradation: without an LLM provider, the analyzer returns deterministic baseline inference with review_required=True.
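The bind-variable convention is standard AQL: @@name binds a collection, @name binds a plain value, so no identifiers are spliced into the query text. An illustrative pair (not output of the library):

```python
# Collection name goes through the @@collection bind parameter rather than
# string interpolation; @limit binds an ordinary value.
aql = "FOR doc IN @@collection LIMIT @limit RETURN doc"
bind_vars = {"@collection": "customers", "limit": 5}

# db.aql.execute(aql, bind_vars=bind_vars)  # with a python-arango database handle
```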
Integration evaluation (Docker ArangoDB)
Bring up a local ArangoDB:
docker compose up -d
Run integration tests (opt-in):
export RUN_INTEGRATION=1
export ARANGO_URL=http://localhost:18529
export ARANGO_DB=schema_analyzer_it
export ARANGO_USER=root
export ARANGO_PASS=openSesame
pytest -q -m integration
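The RUN_INTEGRATION switch is the usual opt-in environment-variable gate; a sketch of how such a guard is typically wired (illustrative, not copied from the test suite):

```python
import os

# Integration tests run only when RUN_INTEGRATION=1 is exported;
# a pytest skipif marker would wrap this predicate.
def integration_enabled() -> bool:
    return os.environ.get("RUN_INTEGRATION") == "1"
```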
File details
Details for the file arangodb_schema_analyzer-0.3.0.tar.gz.
File metadata
- Download URL: arangodb_schema_analyzer-0.3.0.tar.gz
- Upload date:
- Size: 63.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 18c38ccfea88e8e0a89be38c5d542caf1c145d17a0dbef9df2892cd6bd1f2a73 |
| MD5 | 04f9593b1f597426144b0a41ced79e8e |
| BLAKE2b-256 | 1f7e350145b13400280185d57764b9813bf53f44c415cbedc2eff75438689bca |
Provenance
The following attestation bundles were made for arangodb_schema_analyzer-0.3.0.tar.gz:
Publisher: publish.yml on ArthurKeen/arango-schema-mapper
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: arangodb_schema_analyzer-0.3.0.tar.gz
- Subject digest: 18c38ccfea88e8e0a89be38c5d542caf1c145d17a0dbef9df2892cd6bd1f2a73
- Sigstore transparency entry: 1343148106
- Sigstore integration time:
- Permalink: ArthurKeen/arango-schema-mapper@d8a1802b3b89743a0a224bf0cd4294716be01074
- Branch / Tag: refs/heads/main
- Owner: https://github.com/ArthurKeen
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@d8a1802b3b89743a0a224bf0cd4294716be01074
- Trigger Event: workflow_dispatch
File details
Details for the file arangodb_schema_analyzer-0.3.0-py3-none-any.whl.
File metadata
- Download URL: arangodb_schema_analyzer-0.3.0-py3-none-any.whl
- Upload date:
- Size: 74.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 972586fcaa367832c07fa0185f8a7a22b39ea13833e6977a4bb2c21752f1770b |
| MD5 | c2d0e08a8729fe024340c623fe5d8c1a |
| BLAKE2b-256 | b57a372959f7e331dcb658ab17088f3d7c70b430d0dd32de555bdc965d50c98a |
Provenance
The following attestation bundles were made for arangodb_schema_analyzer-0.3.0-py3-none-any.whl:
Publisher: publish.yml on ArthurKeen/arango-schema-mapper
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: arangodb_schema_analyzer-0.3.0-py3-none-any.whl
- Subject digest: 972586fcaa367832c07fa0185f8a7a22b39ea13833e6977a4bb2c21752f1770b
- Sigstore transparency entry: 1343148109
- Sigstore integration time:
- Permalink: ArthurKeen/arango-schema-mapper@d8a1802b3b89743a0a224bf0cd4294716be01074
- Branch / Tag: refs/heads/main
- Owner: https://github.com/ArthurKeen
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@d8a1802b3b89743a0a224bf0cd4294716be01074
- Trigger Event: workflow_dispatch