# pg_ai

## Overview
pg_ai is an open-source MCP (Model Context Protocol) server tailored for PostgreSQL databases. It bridges LLMs with your Postgres data by allowing users to define custom "skills" — encapsulated business logic in markdown files — that guide intelligent database interactions. By implementing the SKILL Graph technique, pg_ai enables dynamic, on-demand context expansion, preventing LLM overload while scaling complexity through interconnected skills.
Inspired by Anthropic's Claude Skills and the SKILL Graph concept (as discussed in this X thread), pg_ai turns your database into a contextual powerhouse for AI agents.
## Key Features
- Postgres Integration: Asynchronous connections for efficient querying.
- Skill Management: Load custom skills from markdown files (SKILL.md) in a progressive, token-efficient manner.
- Dynamic Context Growth: Use SKILL Graph to link and load skills on-demand, building a network of reusable logic.
- MCP Compliance: Exposes tools for LLMs to read business logic, load specific skills, and execute SQL queries.
- Logging and Configurability: Environment-based setup with dedicated logging.
## Architecture
pg_ai is built on FastMCP (a FastAPI-based MCP implementation) and uses asyncpg for PostgreSQL interactions. The core components include:
- Server Setup (`app.py`): Initializes the MCP server with a lifespan hook for database connection management.
- PgMCP Class (`src/mcp_server/server.py`): Wraps FastMCP to configure the server with tools and lifespan.
- Postgres Connector (`src/connectors/pg_connector.py`): Handles async connect/disconnect to Postgres using the `databases` library.
- Environment Loader (`src/loaders/env_loader.py`): Loads settings from `.env` using Pydantic for validation.
- Logger (`src/logger/mcp_logger.py`): Configures file-based logging for server events.
- Tools (`src/mcp_tools/tools.py`): Defines MCP tools:
  - `read_business_logic()`: Loads default business logic from `pg_skills/business-logic/SKILL.md`.
  - `load_skill(skill_name)`: Dynamically loads a specific skill's instructions from `pg_skills/<skill_name>/SKILL.md`.
  - `execute_query(sql_query)`: Executes SQL queries on the connected Postgres DB and returns results as a Polars DataFrame.
The SKILL Graph is realized through interconnected skills: each `SKILL.md` can reference other skills, allowing the LLM to traverse the graph by calling `load_skill` as needed. This grows context incrementally, aligning with MCP's goal of efficient external system integration.
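That traversal can be sketched with plain file reads. A minimal sketch, assuming a `load_skill('name')` reference marker inside skill bodies and the `pg_skills/<name>/SKILL.md` layout described above; the project's actual cross-reference convention may differ:

```python
import re
from pathlib import Path


def load_skill(skills_dir: Path, skill_name: str) -> str:
    """Return one skill's raw SKILL.md text (raises FileNotFoundError if absent)."""
    return (skills_dir / skill_name / "SKILL.md").read_text(encoding="utf-8")


def reachable_skills(skills_dir: Path, start: str) -> set[str]:
    """Walk the SKILL Graph from `start`, following cross-references.

    Assumes skills reference each other with a literal load_skill('other-skill')
    marker in the markdown body (a hypothetical convention for this sketch).
    The visited set guards against reference cycles.
    """
    seen: set[str] = set()
    stack = [start]
    while stack:
        name = stack.pop()
        if name in seen:
            continue
        seen.add(name)
        try:
            body = load_skill(skills_dir, name)
        except FileNotFoundError:
            continue  # dangling reference: skip rather than fail
        stack.extend(re.findall(r"load_skill\('([\w-]+)'\)", body))
    return seen
```

In practice the LLM performs this walk itself, one `load_skill` call at a time, so only the skills it actually needs ever enter the context window.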
## Installation

### Prerequisites
- Python >= 3.12
- PostgreSQL database
- Git
### Steps
1. Clone the repository:

   ```bash
   git clone https://github.com/saiprasaad2002/pg_ai.git
   cd pg_ai
   ```

2. Create a virtual environment (preferably Python 3.12):

   ```bash
   uv venv --python 3.12
   ```

3. Install dependencies:

   ```bash
   uv sync
   ```

4. Copy and configure the environment file:

   ```bash
   cp .env.example .env
   ```

   Edit `.env` with your Postgres credentials and MCP server settings:

   - `DB_USER`, `DB_PASS`, `DB_HOST`, `DB_PORT`, `DB_NAME`: Postgres connection details.
   - `MCP_SERVER_HOST`, `MCP_SERVER_PORT`, `MCP_SERVER_TRANSPORT`: Server config (e.g., `STREAMABLE-HTTP` for production).

5. Run the server:

   ```bash
   uv run app.py
   ```
The server will start, connect to your Postgres DB, and expose MCP endpoints.
## Usage

### Adding Skills
Skills are stored in the `pg_skills/` directory. Each skill is a subfolder containing a `SKILL.md` file.
- Structure example:

  ```
  pg_skills/
  ├── business-logic/
  │   └── SKILL.md   # Default business logic (e.g., table schemas, query guidelines)
  └── custom-skill/
      └── SKILL.md   # Custom skill instructions (YAML frontmatter + Markdown body)
  ```

- SKILL.md format (inspired by Claude Skills):
  - YAML frontmatter: Minimal metadata (name, description) for progressive disclosure.
  - Body: Detailed instructions, examples, or business logic for the LLM.
  - Example:

    ```markdown
    ---
    name: inventory-check
    description: Checks inventory levels in the products table. Use when querying stock.
    ---

    # Inventory Check Skill

    To check inventory:

    1. Query the `products` table: SELECT * FROM products WHERE id = {id};
    2. Analyze stock levels.
    ...
    ```
Skills can reference others (e.g., "Load the 'reporting' skill for summaries"), forming a graph for dynamic loading.
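Splitting the frontmatter from the body is what makes progressive disclosure work: only `name` and `description` need to be surfaced up front, while the body is loaded when the skill is invoked. A minimal stdlib sketch, assuming simple flat `key: value` frontmatter (a real loader would likely use a YAML parser, and a body containing a literal `---` rule would need a stricter split):

```python
def parse_skill(text: str) -> tuple[dict[str, str], str]:
    """Split a SKILL.md into (frontmatter metadata, markdown body).

    Assumes the frontmatter is flat `key: value` lines delimited by
    `---` fences, as in the example above. Files without frontmatter
    are returned with empty metadata.
    """
    if not text.startswith("---"):
        return {}, text
    # split on at most two fences: before, frontmatter, body
    _, frontmatter, body = text.split("---", 2)
    meta: dict[str, str] = {}
    for line in frontmatter.strip().splitlines():
        key, _, value = line.partition(":")
        meta[key.strip()] = value.strip()
    return meta, body.strip()
```

The metadata dict is what a skill index would expose to the LLM; the body is only returned by `load_skill` on demand.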
### Interacting with the Server
- Connect your LLM (e.g., Claude, ChatGPT) via an MCP client.
- The LLM can call tools to load skills and execute queries, building context as needed.
- Logs are saved in `mcp_logs/pg_ai_log.log`.
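File-based logging to a path like `mcp_logs/pg_ai_log.log` can be set up entirely with the stdlib. A sketch with assumed format and level defaults; the project's `mcp_logger` may configure these differently:

```python
import logging
from pathlib import Path


def get_mcp_logger(log_dir: str = "mcp_logs",
                   filename: str = "pg_ai_log.log") -> logging.Logger:
    """Return a file-backed logger writing to <log_dir>/<filename>.

    The directory, format string, and INFO level are assumptions for
    this sketch, mirroring the log path documented above.
    """
    Path(log_dir).mkdir(parents=True, exist_ok=True)
    logger = logging.getLogger("pg_ai")
    if not logger.handlers:  # avoid stacking handlers on repeated calls
        handler = logging.FileHandler(Path(log_dir) / filename)
        handler.setFormatter(
            logging.Formatter("%(asctime)s %(levelname)s %(message)s")
        )
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger
```

The handler-dedup check matters because `logging.getLogger("pg_ai")` returns the same cached logger on every call.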
## Contributing
Contributions are welcome! Please follow these steps:
1. Fork the repository.
2. Create a feature branch: `git checkout -b feature/new-feature`.
3. Commit changes: `git commit -m 'Add new feature'`.
4. Push to the branch: `git push origin feature/new-feature`.
5. Open a Pull Request.
## License
This project is licensed under the MIT License - see the LICENSE file for details.
## Support

For questions, open an issue or contact the maintainer.