A Python client for temporal reasoning and learning using Model Context Protocol (MCP) servers
Temprl MCP Client
A flexible Python library and CLI tool for interacting with Model Context Protocol (MCP) servers using any LLM model.
Overview
Temprl MCP Client is both a Python library and a command-line tool that allows you to query and interact with MCP servers through natural language. It connects to any number of configured MCP servers, makes their tools available to language models (OpenAI, Anthropic, Ollama, LMStudio), and provides a conversational interface for accessing and manipulating data from these servers.
The project demonstrates how to:
- Connect to multiple MCP servers simultaneously
- List and call tools provided by these servers
- Use function calling capabilities to interact with external data sources
- Process and present results in a user-friendly way
- Create a reusable Python library with a clean API
- Build a command-line interface on top of the library
Features
- Multiple Provider Support: Works with OpenAI, Anthropic, Ollama, and LMStudio models
- Modular Architecture: Clean separation of concerns with provider-specific modules
- Dual Interface: Use as a Python library or command-line tool
- MCP Server Integration: Connect to any number of MCP servers simultaneously
- Tool Discovery: Automatically discover and use tools provided by MCP servers
- Flexible Configuration: Configure models and servers through JSON configuration
- Environment Variable Support: Securely store API keys in environment variables
- Comprehensive Documentation: Detailed usage examples and API documentation
- Installable Package: Easy installation via pip, which provides the temprl-mcp-client command
Prerequisites
Before installing Temprl MCP Client, ensure you have the following prerequisites installed:
- Python 3.8+
- SQLite - A lightweight database used by the demo
- uv/uvx - A fast Python package installer and resolver
Setting up Prerequisites
Windows
- Python 3.8+:
  - Download and install from python.org
  - Ensure you check "Add Python to PATH" during installation
- SQLite:
  - Download the precompiled binaries from the SQLite website
  - Choose the "Precompiled Binaries for Windows" section and download the sqlite-tools zip file
  - Extract the files to a folder (e.g., C:\sqlite)
  - Add this folder to your PATH:
    - Open Control Panel > System > Advanced System Settings > Environment Variables
    - Edit the PATH variable and add the path to your SQLite folder
  - Verify installation by opening Command Prompt and typing sqlite3 --version
- uv/uvx:
  - Open PowerShell as Administrator and run:
    powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
  - Restart your terminal and verify installation with uv --version
macOS
- Python 3.8+:
  - Install using Homebrew: brew install python
- SQLite:
  - SQLite comes pre-installed on macOS, but you can update it using Homebrew: brew install sqlite
  - Verify installation with sqlite3 --version
- uv/uvx:
  - Install using Homebrew: brew install uv
  - Or use the official installer: curl -LsSf https://astral.sh/uv/install.sh | sh
  - Verify installation with uv --version
Linux (Ubuntu/Debian)
- Python 3.8+:
  - Install with apt: sudo apt update && sudo apt install python3 python3-pip
- SQLite:
  - Install with apt: sudo apt update && sudo apt install sqlite3
  - Verify installation with sqlite3 --version
- uv/uvx:
  - Install with the official installer: curl -LsSf https://astral.sh/uv/install.sh | sh
  - Verify installation with uv --version
Installation
Option 1: Install from PyPI (Recommended)
```shell
pip install temprl-mcp-client
```
Configuration
The project uses two main configuration files:
- .env - Contains OpenAI API configuration:

  ```
  OPENAI_API_KEY=your_openai_api_key_here
  OPENAI_MODEL=gpt-4o
  # OPENAI_BASE_URL=https://api.openai.com/v1  # Uncomment and modify if using a custom base URL
  ```

- mcp_config.json - Defines MCP servers to connect to:

  ```json
  {
    "mcpServers": {
      "server1": {
        "command": "command-to-start-server",
        "args": ["arg1", "arg2"],
        "env": {
          "ENV_VAR1": "value1",
          "ENV_VAR2": "value2"
        }
      },
      "server2": {
        "command": "another-server-command",
        "args": ["--option", "value"]
      }
    }
  }
  ```
You can add as many MCP servers as you need, and the client will connect to all of them and make their tools available.
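When writing your own mcp_config.json, a quick standard-library check can catch malformed entries before the client starts. This is an illustrative sketch only: the sqlite server name and uvx command below are placeholders, not part of this package.

```python
import json

# Example configuration matching the mcp_config.json shape above
# (server name and command are placeholders, not shipped defaults).
raw = """
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "test.db"]
    }
  }
}
"""

config = json.loads(raw)
servers = config.get("mcpServers", {})

for name, spec in servers.items():
    # Every server entry needs at least a command to launch it.
    if "command" not in spec:
        raise ValueError(f"server {name!r} is missing a 'command'")
    print(name, "->", spec["command"], *spec.get("args", []))
```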
Usage
The Temprl MCP client now supports storing chat memories in a PostgreSQL database, which enables:
- Persistent storage of conversations across sessions
- Loading past conversations by ID
- Continuing conversations from where they left off
Setting up PostgreSQL
- Install PostgreSQL if you don't have it already
- Create a database for the chat memory system
- Configure the connection in your .env file:

  ```
  # PostgreSQL Configuration
  POSTGRES_HOST=localhost
  POSTGRES_PORT=5432
  POSTGRES_DB=temprl_mcp
  POSTGRES_USER=postgres
  POSTGRES_PASSWORD=your_password
  ```
- Run the setup script to initialize the database:

  ```
  python setup_postgres.py
  ```
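In code, the same connection settings can be assembled from the environment with the standard library. A minimal sketch follows; the variable names and defaults mirror the .env keys above, but postgres_dsn is a hypothetical helper, not part of the package:

```python
import os

def postgres_dsn(env=os.environ):
    # Read each setting from the environment, falling back to the
    # defaults shown in the .env example above.
    host = env.get("POSTGRES_HOST", "localhost")
    port = env.get("POSTGRES_PORT", "5432")
    db = env.get("POSTGRES_DB", "temprl_mcp")
    user = env.get("POSTGRES_USER", "postgres")
    password = env.get("POSTGRES_PASSWORD", "")
    # Build a libpq-style keyword/value connection string.
    return f"host={host} port={port} dbname={db} user={user} password={password}"

print(postgres_dsn({}))  # all defaults when no variables are set
```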
Using Chat Memory
Starting a New Chat
When you start a conversation, the system automatically generates a unique ID for the chat:
```python
from temprl_mcp_client.client import initialize_mcp, run_interaction

# Initialize MCP with a new chat memory
mcp_manager = await initialize_mcp()

# Get the chat ID for later use
chat_id = mcp_manager.chat_memory.chat_id
print(f"New chat created with ID: {chat_id}")

# Run interactions
response = await run_interaction(
    user_query="Your question here",
    mcp_manager=mcp_manager
)
```
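Because initialize_mcp and run_interaction are coroutines, a standalone script needs an event loop. The sketch below shows the asyncio.run pattern with a stand-in coroutine, since exercising the real client requires configured servers and API keys:

```python
import asyncio

# Stand-in for run_interaction; in a real script you would
# await run_interaction(user_query=..., mcp_manager=...) here.
async def fake_interaction(user_query: str) -> str:
    await asyncio.sleep(0)  # yield to the event loop, as real I/O would
    return f"echo: {user_query}"

async def main() -> str:
    return await fake_interaction("Your question here")

# asyncio.run creates the event loop, runs main(), and cleans up.
response = asyncio.run(main())
print(response)
```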
Loading a Chat by ID
To continue a previous conversation, use the chat ID:
```python
from temprl_mcp_client.client import initialize_mcp, run_interaction

# Load an existing chat by ID
chat_id = "your-previous-chat-id"
mcp_manager = await initialize_mcp(chat_id=chat_id)

# Check if the chat was found
if mcp_manager.chat_memory.is_new:
    print(f"No chat found with ID: {chat_id}")
else:
    print(f"Loaded chat: {mcp_manager.chat_memory.title}")

# Continue the conversation
response = await run_interaction(
    user_query="Your next question",
    mcp_manager=mcp_manager
)
```
Using the Chat ID Tool
The client comes with a convenient tool for managing chats by ID:
```shell
# List all available chats
python chat_by_id.py --list

# Start a new chat
python chat_by_id.py

# Load a chat by ID
python chat_by_id.py --id your-chat-id

# Load a chat and immediately send a query
python chat_by_id.py --id your-chat-id --query "Your question here"
```
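Flags like these are straightforward to handle with argparse. The sketch below mirrors the options shown above, but it is a hypothetical reimplementation for illustration, not the shipped chat_by_id.py:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Mirror the three flags used in the examples above.
    parser = argparse.ArgumentParser(description="Manage chats by ID (illustrative).")
    parser.add_argument("--list", action="store_true", help="list all available chats")
    parser.add_argument("--id", dest="chat_id", help="chat ID to load")
    parser.add_argument("--query", help="query to send after loading the chat")
    return parser

args = build_parser().parse_args(["--id", "abc123", "--query", "hello"])
print(args.chat_id, args.query)
```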
API Reference
The chat memory system provides the following features:
- ChatMemory(chat_id=None) - Create or load a chat memory
- chat_memory.chat_id - Get the unique ID for the current chat
- ChatMemory.list_conversations() - List all available chats
- ChatMemory.delete_conversation(chat_id) - Delete a chat by ID
The system automatically persists all messages to the database as you chat, so there's no need to manually save the state.
Download files
File details
Details for the file temprl_mcp_client-0.1.1.tar.gz.
File metadata
- Download URL: temprl_mcp_client-0.1.1.tar.gz
- Upload date:
- Size: 33.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 00d613343d942216b29268b670e17dc6bd63e0a3d145ae89d43883ae02c7280e |
| MD5 | e1eed9418bdd9cc43a930a4fbab14ba2 |
| BLAKE2b-256 | e9c3a5524efe13b322885a79d6ede2de659cc0b52d25808c1b6300115dd35bb0 |
File details
Details for the file temprl_mcp_client-0.1.1-py3-none-any.whl.
File metadata
- Download URL: temprl_mcp_client-0.1.1-py3-none-any.whl
- Upload date:
- Size: 33.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.7
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 78d48665e5471aa990999ee18c11350a7c4435c89ebcc3f73f848747e9185309 |
| MD5 | 4ca029ceac267d18c3400c7358923fa4 |
| BLAKE2b-256 | 3cce0b56e919e8c2d0391b3b290c5f91ccf94d8cd71a51f1203c3be8a602bbf4 |