
Manage your git projects

Project description

Repository Manager - A2A | AG-UI | MCP

Version: 1.3.53

Overview

  • Pydantic Graph Architecture: 19 specialized domain nodes (Git, File, Workspace, and 15+ integrated engineering skills) for intelligent, granular task routing.
  • Declarative Workspace: Manage your entire ecosystem via workspace.yml, validated by strict Pydantic V2 models.
  • Idempotent Synchronization: One-click setup that intelligently clones missing repositories and pulls existing ones into their correct hierarchical paths.
  • Workspace Visualization:
    • ASCII Tree: Generate beautiful folder structures directly in the CLI or via MCP.
    • Mermaid Diagrams: Export your workspace model as a visual graph for documentation.
  • Integrated Skills: Native support for agent-builder, mcp-builder, web-search, and more, coupled with expert documentation for FastMCP, Pydantic AI, and Docker.
Architecture:

  1. Nodes: Specialized agents for high-context domains (e.g., GitOpsNode, KnowledgeNode).
  2. Router: Automatically directs intent based on tool tags (git_operations, workspace_management, etc.).
  3. Engine: The core WorkspaceManager processes the workspace.yml model to maintain state.
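The routing step above can be sketched as a tag-to-node lookup. This is a minimal illustration of the idea, assuming a simple registry; the node names and the `NODE_TAGS` mapping are illustrative, not the library's actual implementation:

```python
# Hypothetical tag-based intent routing, illustrating the Router's job.
# The real repository-manager registry may differ.
NODE_TAGS = {
    "GitOpsNode": {"git_operations"},
    "WorkspaceNode": {"workspace_management"},
    "KnowledgeNode": {"web_search", "documentation"},
}

def route(tool_tag: str) -> str:
    """Return the first node whose tag set contains the requested tag."""
    for node, tags in NODE_TAGS.items():
        if tool_tag in tags:
            return node
    raise ValueError(f"No node registered for tag: {tool_tag}")
```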

🛠️ Usage

Workspace Configuration (workspace.yml)

Define your world in a single file:

name: "My Workspace"
path: "./workspace"
repositories:
  - url: "https://github.com/org/repo-core.git"
subdirectories:
  agents:
    repositories:
      - url: "https://github.com/org/agent-1.git"
maintenance:
  phases:
    - name: "Phase 1: Core"
      phase: 1
      project: "repo-core"
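The hierarchy above determines where each repository lands on disk (subdirectory keys become intermediate folders). A stdlib-only sketch of how those clone paths could be derived; the real tool validates the model with strict Pydantic V2 models, and the function names here are hypothetical:

```python
# Illustrative derivation of hierarchical clone paths from the workspace model.
from pathlib import PurePosixPath

def repo_name(url: str) -> str:
    """Derive a directory name from a Git URL (org/repo.git -> repo)."""
    return url.rstrip("/").rsplit("/", 1)[-1].removesuffix(".git")

def clone_paths(workspace: dict) -> list[str]:
    """List target paths for top-level and subdirectory repositories."""
    root = PurePosixPath(workspace["path"])
    paths = [str(root / repo_name(r["url"])) for r in workspace.get("repositories", [])]
    for sub, cfg in workspace.get("subdirectories", {}).items():
        paths += [str(root / sub / repo_name(r["url"])) for r in cfg.get("repositories", [])]
    return paths

ws = {
    "path": "workspace",
    "repositories": [{"url": "https://github.com/org/repo-core.git"}],
    "subdirectories": {"agents": {"repositories": [{"url": "https://github.com/org/agent-1.git"}]}},
}
print(clone_paths(ws))  # ['workspace/repo-core', 'workspace/agents/agent-1']
```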

MCP

AI Prompt:

Setup my workspace using the workspace.yml configuration. Also, install and validate all projects in the workspace.

AI Response:

Workspace setup complete: Missing repositories have been cloned and existing ones updated.
Bulk operations finished: All projects installed and validated (agent/mcp) across the workspace.

This repository is actively maintained - Contributions are welcome!

A2A Agent

Architecture:

---
config:
  layout: dagre
---
flowchart TB
 subgraph subGraph0["Agent Capabilities"]
        C["Agent"]
        B["A2A Server - Uvicorn/FastAPI"]
        D["MCP Tools"]
        F["Agent Skills"]
  end
    C --> D & F
    A["User Query"] --> B
    B --> C
    D --> E["Platform API"]

     C:::agent
     B:::server
     A:::server
    classDef server fill:#f9f,stroke:#333
    classDef agent fill:#bbf,stroke:#333,stroke-width:2px
    style B stroke:#000000,fill:#FFD600
    style D stroke:#000000,fill:#BBDEFB
    style F fill:#BBDEFB
    style A fill:#C8E6C9
    style subGraph0 fill:#FFF9C4

Component Interaction Diagram

sequenceDiagram
    participant User
    participant Server as A2A Server
    participant Agent as Agent
    participant Skill as Agent Skills
    participant MCP as MCP Tools

    User->>Server: Send Query
    Server->>Agent: Invoke Agent
    Agent->>Skill: Analyze Skills Available
    Skill->>Agent: Provide Guidance on Next Steps
    Agent->>MCP: Invoke Tool
    MCP-->>Agent: Tool Response Returned
    Agent-->>Agent: Return Results Summarized
    Agent-->>Server: Final Response
    Server-->>User: Output

Usage

CLI

Short Flag Long Flag Description
-h --help See Usage
-b --default-branch Checkout default branch
-c --clone Clone projects specified in workspace file
-p --pull Pull all projects in workspace
-w --workspace Specify the workspace root directory
-f --file Specify the workspace YAML file (Default)
-r --repositories Comma separated Git URLs (Override)
-t --threads Number of parallel threads (Default: 12)
-m --maintain Run phased maintenance workflow
--pre-commit Run parallel pre-commit checks
--bump Bulk version bump (patch, minor, major)
--phase Start maintenance at Phase N (1-5)
--dry-run Preview changes without applying them
--skip-pre-commit Skip pre-commit phase in maintenance
--install [NEW] Bulk install all Python projects
--build [NEW] Bulk build all Python projects
--validate [NEW] Bulk validate all agent/MCP servers
--type Validation filter: agent, mcp, or all
--tree [NEW] Generate ASCII workspace tree
--mermaid [NEW] Generate Mermaid workspace diagram
--setup [NEW] Sync workspace from YAML config
repository-manager \
    --clone \
    --pull \
    --workspace '/home/user/Downloads' \
    --file '/home/user/Downloads/repositories.txt' \
    --repositories 'https://github.com/Knucklessg1/media-downloader,https://github.com/Knucklessg1/genius-bot' \
    --threads 8

MCP CLI

Short Flag Long Flag Description
-h --help Display help information
-t --transport Transport method: 'stdio', 'http', or 'sse' [legacy] (default: stdio)
-s --host Host address for HTTP transport (default: 0.0.0.0)
-p --port Port number for HTTP transport (default: 8000)
--auth-type Authentication type: 'none', 'static', 'jwt', 'oauth-proxy', 'oidc-proxy', 'remote-oauth' (default: none)
--token-jwks-uri JWKS URI for JWT verification
--token-issuer Issuer for JWT verification
--token-audience Audience for JWT verification
--oauth-upstream-auth-endpoint Upstream authorization endpoint for OAuth Proxy
--oauth-upstream-token-endpoint Upstream token endpoint for OAuth Proxy
--oauth-upstream-client-id Upstream client ID for OAuth Proxy
--oauth-upstream-client-secret Upstream client secret for OAuth Proxy
--oauth-base-url Base URL for OAuth Proxy
--oidc-config-url OIDC configuration URL
--oidc-client-id OIDC client ID
--oidc-client-secret OIDC client secret
--oidc-base-url Base URL for OIDC Proxy
--remote-auth-servers Comma-separated list of authorization servers for Remote OAuth
--remote-base-url Base URL for Remote OAuth
--allowed-client-redirect-uris Comma-separated list of allowed client redirect URIs
--eunomia-type Eunomia authorization type: 'none', 'embedded', 'remote' (default: none)
--eunomia-policy-file Policy file for embedded Eunomia (default: mcp_policies.json)
--eunomia-remote-url URL for remote Eunomia server

A2A CLI

Short Flag Long Flag Description
-h --help Display help information
--host Host to bind the server to (default: 0.0.0.0)
--port Port to bind the server to (default: 9000)
--reload Enable auto-reload
--provider LLM Provider: 'openai', 'anthropic', 'google', 'huggingface'
--model-id LLM Model ID (default: qwen3:4b)
--base-url LLM Base URL (for OpenAI compatible providers)
--api-key LLM API Key
--python-sandbox-enable Enable Python Sandbox MCP configuration
--workspace Workspace to scan for git projects (default: current directory)

Unified Hybrid Graph Intelligence

The Repository Manager natively integrates LadybugDB, NetworkX, and semantic embeddings into a single GraphEngine architecture. This provides deep structural and multimodal intelligence across your Workspace without duplicated repository states.

flowchart TD
    subgraph Data Sources
        YAML[workspace.yml]
        Files[Code / Docs / Images]
    end

    subgraph WorkspaceManager
        Parse[Parse YAML & Groups]
    end

    subgraph GraphEngine
        direction TB
        subgraph GraphConstruction [In-Memory Construction]
            NX[(NetworkX)]
            AST[Tree-sitter AST Pass]
            Semantic[LLM Rationale Pass]
            Leiden[Leiden Clustering]
        end

        subgraph GraphPersistence [Persistence & Storage]
            LB[(LadybugDB .lbug)]
            Sync[Sync / MERGE]
            Vector[Vector Indexes]
        end

        AST --> NX
        Semantic --> NX
        NX --> Leiden
        NX <-->|"get_as_networkx()"| LB
        NX --> Sync
        Sync --> LB
        LB --> Vector
    end

    subgraph MCP Tools
        direction LR
        subgraph GraphIntelligence [Graph Intelligence]
            Impact[graph_impact]
            Search[graph_query]
            Build[graph_build]
            Path[graph_path]
            Status[graph_status]
            Reset[graph_reset]
        end
    end

    YAML --> Parse
    Files --> AST
    Files --> Semantic
    Parse --> Build
    Build --> GraphConstruction
    Impact --> LB
    Search --> LB
    Search --> Vector
    Path --> NX
    Status --> NX
    Reset --> LB

repository-manager --maintain --workspace /path/to/my/projects

This will:

  1. Parse workspace.yml for all repository definitions and dependency groups.
  2. Incrementally parse changed files constructing a NetworkX Graph and sync to LadybugDB.
  3. Expose tools natively to your AI Agent (e.g. graph_impact, graph_query).

Python Sandbox Integration

The Agent can execute Python code in a secure Deno sandbox using mcp-run-python.

repository_manager_a2a --python-sandbox-enable

This will:

  1. Configure mcp_config.json to include the python-sandbox server.
  2. Enable the Python Sandbox skill, allowing the agent to run scripts for calculation, testing, or logic verification.
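As an illustration only (the exact fragment written by --python-sandbox-enable may differ), a python-sandbox entry in mcp_config.json could look like:

```json
{
  "mcpServers": {
    "python-sandbox": {
      "command": "deno",
      "args": [
        "run", "-N", "-R=node_modules", "-W=node_modules",
        "--node-modules-dir=auto", "jsr:@pydantic/mcp-run-python", "stdio"
      ]
    }
  }
}
```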

Default Workspace Model

The manager automatically discovers workspace.yml in the current directory or via the WORKSPACE_YML environment variable. This file serves as the strict single source of truth for the entire environment hierarchy, encompassing repositories, subdirectories, and maintenance policies.

Maintenance Workflows

repository-manager supports specialized maintenance workflows for managing interdependent package ecosystems.

Parallel Pre-commits

Run pre-commit checks across all repositories in parallel. This is significantly faster than sequential runs and simplifies fleet-wide health checks.

repository-manager --pre-commit
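A sketch of how such a fan-out could be structured with a thread pool: one pre-commit subprocess per repository, bounded by a worker count. This is not the tool's actual implementation; paths and the default thread count are illustrative:

```python
# Illustrative parallel pre-commit fan-out across repositories.
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_pre_commit(repo: str) -> tuple[str, int]:
    """Run pre-commit in one repository and report its exit code."""
    proc = subprocess.run(
        ["pre-commit", "run", "--all-files"],
        cwd=repo, capture_output=True, text=True,
    )
    return repo, proc.returncode

def run_all(repos: list[str], threads: int = 12) -> dict[str, int]:
    """Fan out across repositories with a bounded thread pool."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        return dict(pool.map(run_pre_commit, repos))
```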

Phased Bumping

When packages depend on each other, they often need to be bumped in a specific sequence. The --maintain flag implements this 5-stage process:

  1. Skills: Update core skill packages.
  2. Graphs: Update AI graph/template repositories.
  3. UI: Update frontend components.
  4. Utilities: Update the central utility library (agent-utilities) and propagate skill/graph versions.
  5. Fleet: Propagate the new utility version to all other packages and bump their versions.

# Full maintenance run
repository-manager --maintain

# Skip verify phase (pre-commit) if already done
repository-manager --maintain --skip-pre-commit

# Resume from a specific phase
repository-manager --maintain --phase 4

Using as an MCP Server

The MCP Server can be run in two modes: stdio (for local testing) or http (for networked access). To start the server, use the following commands:

Run in stdio mode (default):

repository-manager-mcp --transport "stdio"

Run in HTTP mode:

repository-manager-mcp --transport "http" --host "0.0.0.0" --port "8000"

Use in Python

from repository_manager.repository_manager import Git

gitlab = Git()
gitlab.set_workspace("<workspace>")
gitlab.set_threads(threads=8)
gitlab.set_git_projects("<projects>")
gitlab.set_default_branch(set_to_default_branch=True)
gitlab.clone_projects_in_parallel()
gitlab.pull_projects_in_parallel()

Deploy MCP Server as a Service

The Repository Manager MCP server can be deployed using Docker, with configurable authentication, middleware, and Eunomia authorization.

Using Docker Run

docker pull knucklessg1/repository-manager:latest

docker run -d \
  --name repository-manager-mcp \
  -p 8004:8004 \
  -e HOST=0.0.0.0 \
  -e PORT=8004 \
  -e TRANSPORT=http \
  -e AUTH_TYPE=none \
  -e EUNOMIA_TYPE=none \
  -v development:/root/Development \
  knucklessg1/repository-manager:latest

For advanced authentication (e.g., JWT, OAuth Proxy, OIDC Proxy, Remote OAuth) or Eunomia, add the relevant environment variables:

docker run -d \
  --name repository-manager-mcp \
  -p 8004:8004 \
  -e HOST=0.0.0.0 \
  -e PORT=8004 \
  -e TRANSPORT=http \
  -e AUTH_TYPE=oidc-proxy \
  -e OIDC_CONFIG_URL=https://provider.com/.well-known/openid-configuration \
  -e OIDC_CLIENT_ID=your-client-id \
  -e OIDC_CLIENT_SECRET=your-client-secret \
  -e OIDC_BASE_URL=https://your-server.com \
  -e ALLOWED_CLIENT_REDIRECT_URIS=http://localhost:*,https://*.example.com/* \
  -e EUNOMIA_TYPE=embedded \
  -e EUNOMIA_POLICY_FILE=/app/mcp_policies.json \
  -v development:/root/Development \
  knucklessg1/repository-manager:latest

Using Docker Compose

Create a docker-compose.yml file:

services:
  repository-manager-mcp:
    image: knucklessg1/repository-manager:latest
    environment:
      - HOST=0.0.0.0
      - PORT=8004
      - TRANSPORT=http
      - AUTH_TYPE=none
      - EUNOMIA_TYPE=none
    volumes:
      - development:/root/Development
    ports:
      - 8004:8004

For advanced setups with authentication and Eunomia:

services:
  repository-manager-mcp:
    image: knucklessg1/repository-manager:latest
    environment:
      - HOST=0.0.0.0
      - PORT=8004
      - TRANSPORT=http
      - AUTH_TYPE=oidc-proxy
      - OIDC_CONFIG_URL=https://provider.com/.well-known/openid-configuration
      - OIDC_CLIENT_ID=your-client-id
      - OIDC_CLIENT_SECRET=your-client-secret
      - OIDC_BASE_URL=https://your-server.com
      - ALLOWED_CLIENT_REDIRECT_URIS=http://localhost:*,https://*.example.com/*
      - EUNOMIA_TYPE=embedded
      - EUNOMIA_POLICY_FILE=/app/mcp_policies.json
    ports:
      - 8004:8004
    volumes:
      - development:/root/Development
      - ./mcp_policies.json:/app/mcp_policies.json

Run the service:

docker-compose up -d

Configure mcp.json for AI Integration

All env entries are optional and can instead be specified at prompt time:

{
  "mcpServers": {
    "repository_manager": {
      "command": "uv",
      "args": [
        "run",
        "--with",
        "repository-manager",
        "repository-manager-mcp"
      ],
      "env": {
        "REPOSITORY_MANAGER_WORKSPACE": "/home/user/Development/",
        "REPOSITORY_MANAGER_THREADS": "12",
        "REPOSITORY_MANAGER_DEFAULT_BRANCH": "True",
        "REPOSITORY_MANAGER_PROJECTS_FILE": "/home/user/Development/repositories.txt"
      },
      "timeout": 300000
    }
  }
}

A2A

Endpoints

  • Web UI: http://localhost:8000/ (if enabled)
  • A2A: http://localhost:8000/a2a (Discovery: /a2a/.well-known/agent.json)
  • AG-UI: http://localhost:8000/ag-ui (POST)

A2A CLI

Short Flag Long Flag Description
-h --help Display help information
--host Host to bind the server to (default: 0.0.0.0)
--port Port to bind the server to (default: 9000)
--reload Enable auto-reload
--provider LLM Provider: 'openai', 'anthropic', 'google', 'huggingface'
--model-id LLM Model ID (default: qwen3:4b)
--base-url LLM Base URL (for OpenAI compatible providers)
--api-key LLM API Key
--mcp-url MCP Server URL (default: http://localhost:8000/mcp)
--web Enable Pydantic AI Web UI (default: False; Env: ENABLE_WEB_UI)

Install Python Package

pip install repository-manager

or

uv pip install --upgrade repository-manager

