
Graph-Sieve 🕸️📊

Full Spectrum Graph Sieve - Automated Technical Term Extraction and Relationship Mapping

graph-sieve is a powerful knowledge management utility and service designed to extract high-fidelity, relationship-aware domain knowledge from unstructured documents (.docx, .pptx, .msg, .pdf, .one). Using a multi-gate verifiable pipeline, it builds a structured knowledge graph that preserves technical context and organizational links.

✨ Core Capabilities

  • 🔍 Multi-Gate Pipeline: A 5-gate extraction flow (Strategic Sieve -> Batch Extraction -> Multi-Source Validation -> Alias Resolution -> Global Synthesis) ensuring high-fidelity term capture with minimal hallucinations.
  • 📄 Multi-Format Support: Native handling of PDF, PPTX, DOCX, MSG, and OneNote (.one) files. Leverages Microsoft MarkItDown for deep document parsing and OCR.
  • 🗺️ Relationship Mapping: Goes beyond simple term lookup by automatically mapping how terms relate (e.g., SUPERSEDES, DEPENDS_ON, HAS_EXPERT).
  • 🌐 Global Synthesis: Automatically clusters the graph into communities and generates executive summaries and a global project narrative.
  • 🇮🇱 Hebrew & Mixed-Language Support: Specialized Bi-Directional (BIDI) support for Hebrew-English technical documents, ensuring technical terms are correctly extracted from mixed-language contexts.
  • ⚙️ Flexible LLM Backend: Run locally with Ollama/vLLM for privacy, or use OpenAI for scale.
  • 📈 Interactive Visualization: Generate dynamic, relationship-aware graph visualizations via PyVis.
  • 🤖 MCP Server: Integrated Model Context Protocol (MCP) server for seamless integration with AI agents like Claude Desktop or Gemini CLI.

🚀 Quick Start

  1. Configure Your LLM: Create a .env file in your working directory:

    LLM_PROVIDER=openai
    OPENAI_API_KEY=your_key_here
    MODEL_NAME=gpt-4o-mini
    

    Or use a local Ollama instance:

    LLM_PROVIDER=ollama
    OLLAMA_BASE_URL=http://localhost:11434
    MODEL_NAME=llama3
    
  2. Scan a Directory:

    graph-sieve-scan ./path/to/documents --db my_knowledge.db
    
  3. Visualize the Results:

    graph-sieve-visualize --db my_knowledge.db
    

🛠️ CLI Command Reference

  • graph-sieve-scan <path>: Extract terms from a directory or file.
    • --db <path>: Path to the SQLite database (default: platform-standard data dir).
    • --seed <path>: High-authority documents to process first.
    • --whitelist <path>: Text file with terms to always include.
    • --retry-failed: Retry processing chunks from the Dead Letter Queue (DLQ).
  • graph-sieve-lookup <term>: Query a term, its definition, and its graph context.
  • graph-sieve-visualize: Generate an interactive HTML graph.
  • graph-sieve-mcp: Launch the MCP server.
  • graph-sieve-whois <term>: Identify experts, owners, and organizations responsible for a term.

📖 Advanced Workflow

💎 Seed Documents

Use the --seed flag to process "Golden" documents (specs, architecture docs) before general notes. This sets the ground truth for term definitions and relationships.
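For example, a scan that anchors definitions on an architecture folder before ingesting the wider corpus might look like this (both paths are illustrative):

```shell
# Process authoritative specs first, then the rest of the corpus
graph-sieve-scan ./docs --seed ./docs/architecture --db my_knowledge.db
```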

🔗 Alias Resolution & Canonicalization

Graph-Sieve automatically performs LLM-verified canonicalization. If it finds "AIP" and "AI Platform" in the same context, it will attempt to merge them into a single canonical entry with appropriate aliases.
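You can verify a merge from the CLI. Assuming the two names above were canonicalized into one entry, looking up either alias should return the same record:

```shell
# Both lookups should resolve to the same canonical entry
graph-sieve-lookup "AIP"
graph-sieve-lookup "AI Platform"
```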

🆘 Dead Letter Queue (DLQ)

If an LLM call fails or a chunk is too complex, it's pushed to the DLQ. Use graph-sieve-scan --retry-failed to re-process these chunks after updating your configuration or models.
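A common recovery pattern is to point the tool at a stronger model and then drain the DLQ. The model name here is illustrative; MODEL_NAME is read from .env or the environment:

```shell
# Switch to a more capable model, then re-process failed chunks
export MODEL_NAME=gpt-4o
graph-sieve-scan ./docs --db my_knowledge.db --retry-failed
```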

⚙️ Configuration (Environment Variables)

Variable         Description                  Default
LLM_PROVIDER     openai, ollama, or vllm      openai
OPENAI_API_KEY   Required if using OpenAI     None
OLLAMA_BASE_URL  URL for the Ollama API       http://localhost:11434
MODEL_NAME       Model used for extraction    gpt-4o-mini
STORAGE_DIR      Directory for graph data     Platform-specific
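All of these can also be exported as environment variables instead of placed in .env. A minimal local-only setup, using only the variables from the table above (values are illustrative):

```shell
# Run fully locally against Ollama; no API key needed
export LLM_PROVIDER=ollama
export OLLAMA_BASE_URL=http://localhost:11434
export MODEL_NAME=llama3
export STORAGE_DIR="$HOME/.graph-sieve"
graph-sieve-scan ./docs --db my_knowledge.db
```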

🧩 AI Agent Integration

Add Graph-Sieve to your MCP-compatible agent's configuration:

{
  "mcpServers": {
    "graph-sieve": {
      "command": "graph-sieve-mcp",
      "args": []
    }
  }
}

License

MIT License. See LICENSE for details.

Download files

Download the file for your platform.

Source Distribution

graph_sieve-1.2.0.tar.gz (73.9 kB)

Built Distribution

graph_sieve-1.2.0-py3-none-any.whl (50.0 kB)

File details

Details for the file graph_sieve-1.2.0.tar.gz.

File metadata

  • Size: 73.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.12

File hashes

Algorithm    Hash digest
SHA256       dbd189139cc64899a04e33255d2715babedbb762276161961d78c5a4ac9f29c9
MD5          74f5acb402612c9aab4779b1ae8b555e
BLAKE2b-256  bcc5a704f1dc3b572c7913f55a43be12946583eb8aacd686c5b0c1aafb75d744

File details

Details for the file graph_sieve-1.2.0-py3-none-any.whl.

File metadata

  • Size: 50.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.12

File hashes

Algorithm    Hash digest
SHA256       6e30ee54e6d44ab511e5085acaec4ae10dea7a712ee97ac36ee9d1137f70a034
MD5          2e1b7488127f03fb0b2b3aa7460cb336
BLAKE2b-256  5c4b186f00c099f0382dfe57393a31bb974931bb6c3c38b3b03b10185f8bf137
