# AI Cogence MCP Tools

Expose AI Cogence capabilities as MCP (Model Context Protocol) tools that can be used in Claude Desktop and other MCP clients.
## What is This?

This package provides 7 AI tools that you can use in MCP-compatible applications:

- `rag_query` - AI-powered Q&A with sources
- `semantic_search` - Vector similarity search
- `list_chat_sessions` - Session management
- `get_session_messages` - Message history
- `get_analytics` - Usage metrics
- `search_knowledge_base` - Keyword search
- `ingest_documents` - Ingest documents from S3 into the vector database
## Installation

```bash
pip install ai-cogence-tools
```
## Configuration

Create a `.env` file with your backend credentials:

```
POSTGRES_USER=your_user
POSTGRES_PASSWORD=your_password
POSTGRES_HOST=your_host
POSTGRES_PORT=5432
POSTGRES_DB=your_db
OPENAI_API_KEY=sk-...
```
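Before launching the server, it can help to confirm that every required variable is actually set. A minimal sketch, assuming the six variables above are all mandatory (the `REQUIRED_VARS` list and `missing_vars` helper are illustrative, not part of the package):

```python
import os

# The variables listed in the Configuration section above.
REQUIRED_VARS = [
    "POSTGRES_USER", "POSTGRES_PASSWORD", "POSTGRES_HOST",
    "POSTGRES_PORT", "POSTGRES_DB", "OPENAI_API_KEY",
]

def missing_vars(env: dict) -> list[str]:
    """Return the required variables that are absent or empty in the given environment."""
    return [v for v in REQUIRED_VARS if not env.get(v)]

if __name__ == "__main__":
    # Check the current process environment before starting the server.
    absent = missing_vars(dict(os.environ))
    if absent:
        print("Missing configuration:", ", ".join(absent))
```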
## Usage with Claude Desktop

Add the server to your Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json` on macOS):

```json
{
  "mcpServers": {
    "ai-cogence-tools": {
      "command": "ai-cogence-tools",
      "env": {
        "POSTGRES_USER": "your_user",
        "POSTGRES_PASSWORD": "your_password",
        "POSTGRES_HOST": "your_host",
        "POSTGRES_DB": "your_db",
        "OPENAI_API_KEY": "sk-..."
      }
    }
  }
}
```
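If you prefer to script the config change, one way is to merge the server entry into the existing JSON file. A sketch using only the standard library (the `SERVER_ENTRY` contents and `add_server` helper are illustrative; fill in your real credentials):

```python
import json
from pathlib import Path

# Entry matching the Claude Desktop config shown above.
SERVER_ENTRY = {
    "ai-cogence-tools": {
        "command": "ai-cogence-tools",
        "env": {"POSTGRES_USER": "your_user"},  # add the remaining variables
    }
}

def add_server(config: dict, entry: dict) -> dict:
    """Return a copy of the config with the MCP server entry merged in."""
    servers = {**config.get("mcpServers", {}), **entry}
    return {**config, "mcpServers": servers}

if __name__ == "__main__":
    # Update the Claude Desktop config in place (macOS path).
    config_path = Path.home() / "Library/Application Support/Claude/claude_desktop_config.json"
    if config_path.exists():
        config = json.loads(config_path.read_text())
        config_path.write_text(json.dumps(add_server(config, SERVER_ENTRY), indent=2))
```

Merging rather than overwriting preserves any other MCP servers already configured.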
Restart Claude Desktop. The tools will appear in the tools menu.
## Programmatic Usage

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Launch the server as a subprocess and connect to it over stdio
server_params = StdioServerParameters(
    command="ai-cogence-tools",
    env={
        "POSTGRES_USER": "your_user",
        "POSTGRES_PASSWORD": "your_password",
        # ... other env vars
    },
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the MCP session
            await session.initialize()

            # List available tools
            tools = await session.list_tools()
            print(f"Available tools: {[tool.name for tool in tools.tools]}")

            # Call a tool
            result = await session.call_tool("rag_query", {
                "question": "What is RAG?"
            })
            print(result.content)

asyncio.run(main())
```
## Available Tools

### rag_query

Execute a RAG query to get AI-powered answers with sources.

Arguments:

- `question` (required): The question to ask
- `session_id` (optional): Session ID for conversation context

### semantic_search

Perform semantic search using vector embeddings.

Arguments:

- `query` (required): Search query
- `top_k` (optional, default=5): Number of results

### list_chat_sessions

List all chat sessions with metadata.

Arguments:

- `limit` (optional, default=20): Maximum sessions to return

### get_session_messages

Get all messages for a specific session.

Arguments:

- `session_id` (required): Session ID

### get_analytics

Get usage analytics and metrics.

Arguments:

- `time_range` (optional, default="today"): Time range (`today`, `week`, `month`, `all`)

### search_knowledge_base

Search the knowledge base using keywords.

Arguments:

- `query` (required): Search query
- `limit` (optional, default=10): Maximum results

### ingest_documents

Ingest documents from an S3 bucket into the vector database. Loads documents, chunks them, creates embeddings, and stores them for RAG queries.

Arguments:

- `force_refresh` (optional, default=false): Force refresh even if documents are already ingested

What it does:

- Loads documents from the S3 bucket
- Chunks documents for optimal retrieval
- Creates vector embeddings
- Stores them in PostgreSQL with pgvector
- Syncs with existing vectors (adds new, removes obsolete)
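The required arguments above can be checked client-side before invoking `call_tool`. A small validator sketch, where the `REQUIRED_ARGS` table mirrors the tool list above and the `validate_call` helper is illustrative, not part of the package:

```python
# Required arguments per tool, as documented in the Available Tools section.
REQUIRED_ARGS = {
    "rag_query": {"question"},
    "semantic_search": {"query"},
    "list_chat_sessions": set(),
    "get_session_messages": {"session_id"},
    "get_analytics": set(),
    "search_knowledge_base": {"query"},
    "ingest_documents": set(),
}

def validate_call(tool: str, args: dict) -> list[str]:
    """Return the required arguments missing from a prospective tool call."""
    return sorted(REQUIRED_ARGS.get(tool, set()) - args.keys())
```

Calling `validate_call("rag_query", {})` returns `["question"]`, so the mistake surfaces before a round-trip to the server.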
## How It Works
This MCP server connects to the AI Cogence backend and exposes its capabilities as tools. It:
- Uses existing backend services (no code duplication)
- Connects to your PostgreSQL database with pgvector
- Uses OpenAI for embeddings and completions
- Provides RAG, search, and analytics capabilities
## Requirements
- Python 3.10+
- PostgreSQL with pgvector extension
- OpenAI API key
- Access to AI Cogence backend database
## License

MIT

## Support

For issues or questions, visit: https://github.com/ai-cogence/mcp-tools/issues
## File Details

### ai_cogence_tools-1.1.0.tar.gz

- Size: 15.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

| Algorithm | Hash digest |
|---|---|
| SHA256 | a331fa3f121788377ff0a9c1f8ff6ec75effe952156629b322cb12267e90716a |
| MD5 | bd2c81c6a9ac2fcb9324e5efbbb02a8b |
| BLAKE2b-256 | 798127b74071ccf0d37a3d20403db6f86b108272f308fd774be4521c7b9550da |
### ai_cogence_tools-1.1.0-py3-none-any.whl

- Size: 12.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.13.5

| Algorithm | Hash digest |
|---|---|
| SHA256 | ca93c8484e5c04068f07f99d339776fde07dae7454011fa5042f4f8f5abc4355 |
| MD5 | bc0de638251549b47a88bd276bda7268 |
| BLAKE2b-256 | 70b3d32b195cc264b8201eb7affa94d0655727471fd719b9c15bb826ec9fe86e |