
MCP-Langgraph Integration Tutorial

This tutorial demonstrates how to integrate Model Context Protocol (MCP) servers with Langgraph agents to create powerful, tool-enabled AI applications. The project showcases a data science assistant named Scout that can help users manage their data science projects using various MCP-powered tools.

Overview

The project implements a conversational AI agent that:

  • Uses GPT-4.1 as the base model
  • Integrates with multiple MCP servers for different functionalities
  • Uses Langgraph for orchestrating the conversation flow
  • Provides a streaming interface for real-time responses
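The core pattern behind a tool-enabled agent like Scout is a loop: the model either requests a tool call (whose result is fed back into the conversation) or produces a final answer. The sketch below is a framework-free illustration of that loop with hypothetical stand-in names; in the actual project, Langgraph and the MCP client handle this orchestration.

```python
# Simplified illustration of an agent tool-call loop. All names here are
# hypothetical stand-ins; the real project delegates this to Langgraph.

def fake_model(messages):
    """Stand-in for the LLM: requests one tool call, then answers."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool": "list_files", "args": {"path": "."}}
    return {"content": "Your project contains: data.csv"}

TOOLS = {"list_files": lambda path: ["data.csv"]}  # stand-in for MCP tools

def run_agent(user_input):
    messages = [{"role": "user", "content": user_input}]
    while True:
        reply = fake_model(messages)
        if "tool" in reply:  # model wants a tool result
            result = TOOLS[reply["tool"]](**reply["args"])
            messages.append({"role": "tool", "content": str(result)})
        else:  # final answer reached
            return reply["content"]

print(run_agent("What's in my project?"))
```

The MCP servers listed below play the role of `TOOLS` here: each server exposes callable tools that the agent can invoke mid-conversation.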

Prerequisites

  • Python 3.13+
  • Node.js (for filesystem MCP server)
  • Docker (for GitHub MCP server)
  • UV package manager
  • OpenAI API key

Project Structure

scout/
├── graph.py           # Langgraph agent implementation
├── client.py          # MCP client and streaming interface
├── client_utils.py    # Utility functions
├── main.py           # Entry point
└── my_mcp/           # MCP server configurations
    ├── config.py     # Config loading and env var resolution
    ├── mcp_config.json # MCP server definitions
    └── local_servers/ # Custom MCP server implementations

Setup

  1. Clone the repository:
git clone <repository-url>
cd mcp-intro
  2. Create and activate a virtual environment:
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  3. Install dependencies:
uv pip install -e .
  4. Set up environment variables: create a .env file with:
OPENAI_API_KEY=your_openai_api_key
MCP_FILESYSTEM_DIR=/path/to/projects/directory
MCP_GITHUB_PAT=your_github_personal_access_token
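A common way to wire these environment variables into the server definitions is to resolve ${VAR} placeholders in mcp_config.json at load time, which is the kind of work my_mcp/config.py is responsible for. The sketch below is a hypothetical, stdlib-only illustration of that pattern; the project's actual implementation may differ.

```python
import json
import os
import re

# Hypothetical sketch of env-var resolution for a JSON config file:
# every "${VAR}" inside a string value is replaced from os.environ.
_PLACEHOLDER = re.compile(r"\$\{(\w+)\}")

def resolve_env(value):
    """Replace ${VAR} in a string with the value of os.environ['VAR']."""
    return _PLACEHOLDER.sub(lambda m: os.environ.get(m.group(1), ""), value)

def load_config(text):
    """Parse a JSON config string, resolving placeholders in all string values."""
    def walk(node):
        if isinstance(node, dict):
            return {k: walk(v) for k, v in node.items()}
        if isinstance(node, list):
            return [walk(v) for v in node]
        if isinstance(node, str):
            return resolve_env(node)
        return node
    return walk(json.loads(text))

os.environ["MCP_GITHUB_PAT"] = "ghp_example"
cfg = load_config('{"env": {"GITHUB_TOKEN": "${MCP_GITHUB_PAT}"}}')
print(cfg["env"]["GITHUB_TOKEN"])  # ghp_example
```

This keeps secrets like the GitHub PAT out of the checked-in config file while still letting each server definition reference them.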

MCP Servers

This project integrates with four MCP servers:

  1. Dataflow Server: Custom implementation for data loading and querying
  2. Filesystem Server: Uses @modelcontextprotocol/server-filesystem for file operations
  3. Git Server: Uses mcp-server-git for local git operations
  4. GitHub Server: Uses the official GitHub MCP server for GitHub operations
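A server definition file for setups like this commonly maps a server name to the command that launches it. The snippet below is a hypothetical sketch of what my_mcp/mcp_config.json might contain, not the project's actual file; commands, arguments, and the exact schema are assumptions.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "${MCP_FILESYSTEM_DIR}"]
    },
    "git": {
      "command": "uvx",
      "args": ["mcp-server-git"]
    },
    "github": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "GITHUB_PERSONAL_ACCESS_TOKEN", "ghcr.io/github/github-mcp-server"],
      "env": {"GITHUB_PERSONAL_ACCESS_TOKEN": "${MCP_GITHUB_PAT}"}
    }
  }
}
```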

Usage

  1. Start the application:
python -m scout.client
  2. Interact with Scout by typing your questions or requests. For example:
USER: Can you help me set up a new data science project?
  3. Scout will use its tools to:
  • Create and manage project directories
  • Handle data loading and transformation
  • Manage version control
  • Interact with GitHub repositories
  4. Type 'quit' or 'exit' to end the session.
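The interactive session follows a familiar REPL pattern: read a line, stop on 'quit' or 'exit', otherwise hand the input to the agent. A simplified, framework-free sketch (the function names are hypothetical; the real client.py streams responses from the Langgraph agent instead of returning a string):

```python
def respond(user_input):
    """Stand-in for the agent call; the real client streams from Langgraph."""
    return f"Scout: I received '{user_input}'"

def chat_loop(inputs):
    """Process user inputs until 'quit' or 'exit'; return the transcript."""
    transcript = []
    for line in inputs:
        if line.strip().lower() in ("quit", "exit"):
            break  # session over; later inputs are ignored
        transcript.append(respond(line))
    return transcript

print(chat_loop(["hello", "quit", "never reached"]))
```

In the real application the inputs come from input() in a while loop rather than a list, but the exit handling is the same.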

How It Works

  1. The graph.py file defines the Langgraph agent structure:
  • Sets up the system prompt and agent state
  • Configures the LLM (GPT-4.1)
  • Defines the conversation flow graph
  2. The client.py file:
  • Initializes the MCP client with multiple servers
  • Handles streaming responses
  • Manages the interactive session
  3. MCP servers provide tools for:
  • File system operations
  • Data manipulation
  • Git operations
  • GitHub interactions
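The conversation-flow graph typically alternates between a model node and a tools node until the model stops requesting tool calls, at which point the graph ends. Below is a minimal, framework-free sketch of that routing logic with hypothetical node functions; in the real graph.py, Langgraph's state graph machinery provides the nodes and conditional edges.

```python
# Minimal state-graph routing sketch: nodes are functions, and the next
# edge is chosen by inspecting the state. Hypothetical illustration only.

def model_node(state):
    """Pretend the model requests one tool call, then finishes."""
    state["pending_tool"] = not state.get("tool_done")
    return state

def tool_node(state):
    """Execute the requested tool and record that it ran."""
    state["tool_done"] = True
    state["pending_tool"] = False
    return state

NODES = {"model": model_node, "tools": tool_node}

def route(state):
    """Conditional edge after the model: run tools if a call is pending, else end."""
    return "tools" if state.get("pending_tool") else "END"

def run_graph():
    state, node, path = {}, "model", []
    while node != "END":
        path.append(node)
        state = NODES[node](state)
        node = route(state) if node == "model" else "model"
    return path

print(run_graph())  # ['model', 'tools', 'model']
```

The visited-node path shows the loop: the model requests a tool, the tool runs, and control returns to the model, which then finishes.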

Extending the Project

You can extend this project by:

  1. Adding new MCP servers in my_mcp/local_servers/
  2. Modifying the system prompt in graph.py
  3. Adding new tools to the agent
  4. Customizing the conversation flow

Contributing

Contributions are welcome! Please feel free to submit a Pull Request.
