
An mcp.run client for Python


mcpx-py


A Python library for interacting with LLMs using mcp.run tools

Features

AI Provider Support

mcpx-py works with any model supported by PydanticAI

Dependencies

  • uv
  • npm
  • ollama (optional)

mcp.run Setup

You will need to get an mcp.run session ID by running:

npx --yes -p @dylibso/mcpx gen-session --write

This will generate a new session and write the session ID to a configuration file that can be used by mcpx-py.

If you would rather store the session ID in an environment variable, run gen-session without the --write flag:

npx --yes -p @dylibso/mcpx gen-session

which should output something like:

Login successful!
Session: kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ

Then set the MCP_RUN_SESSION_ID environment variable:

$ export MCP_RUN_SESSION_ID=kabA7w6qH58H7kKOQ5su4v3bX_CeFn4k.Y4l/s/9dQwkjv9r8t/xZFjsn2fkLzf+tkve89P1vKhQ
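If you load the session ID from the environment in your own code, a minimal sketch might look like the following (standard library only; get_session_id is a hypothetical helper for illustration, not part of mcpx-py):

```python
import os

def get_session_id() -> str:
    """Return the mcp.run session ID from the environment.

    Raises RuntimeError with a pointer to gen-session if it is unset.
    """
    session_id = os.environ.get("MCP_RUN_SESSION_ID")
    if not session_id:
        raise RuntimeError(
            "MCP_RUN_SESSION_ID is not set; run "
            "`npx --yes -p @dylibso/mcpx gen-session` to create a session"
        )
    return session_id
```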

Python Usage

Installation

Using uv:

uv add mcpx-py

Or pip:

pip install mcpx-py

Example code

from mcpx_py import Chat

llm = Chat("claude-3-5-sonnet-latest")

# Or OpenAI
# llm = Chat("gpt-4o")

# Or Ollama
# llm = Chat("ollama:qwen2.5")

# Or Gemini
# llm = Chat("gemini-2.0-flash")

response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)

It's also possible to get structured output by setting result_type:

from mcpx_py import Chat, BaseModel, Field
from typing import List

class Summary(BaseModel):
    """
    A summary of some longer text
    """
    source: str = Field(description="The source of the original text")
    original_text: str = Field(description="The original text to be summarized")
    items: List[str] = Field(description="A list of summary points")

llm = Chat("claude-3-5-sonnet-latest", result_type=Summary)
response = llm.send_message_sync(
    "summarize the contents of example.com"
)
print(response.data)

More examples can be found in the examples/ directory.

Command Line Usage

Installation

uv tool install mcpx-py

From git:

uv tool install git+https://github.com/dylibso/mcpx-py

Or from the root of the repo:

uv tool install .

uvx

mcpx-client can also be run without installing it by using uvx:

uvx --from git+https://github.com/dylibso/mcpx-py mcpx-client

Running

Get usage/help

mcpx-client --help

Chat with an LLM

mcpx-client chat

List tools

mcpx-client list

Call a tool

mcpx-client tool eval-js '{"code": "2+2"}'
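The tool argument is a JSON string, so shell quoting matters. One way to avoid hand-escaping quotes is to build the argument with json.dumps (a standard-library sketch; the tool name and payload are taken from the example above):

```python
import json

# Build the JSON argument for `mcpx-client tool eval-js` programmatically,
# so quotes inside the payload are escaped correctly.
payload = json.dumps({"code": "2+2"})
print(payload)  # {"code": "2+2"}
```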

LLM Configuration

Provider Setup

Claude
  1. Sign up for an Anthropic API account at https://console.anthropic.com
  2. Get your API key from the console
  3. Set the environment variable: ANTHROPIC_API_KEY=your_key_here
OpenAI
  1. Create an OpenAI account at https://platform.openai.com
  2. Generate an API key in your account settings
  3. Set the environment variable: OPENAI_API_KEY=your_key_here
Gemini
  2. Create a Gemini account at https://aistudio.google.com
  2. Generate an API key in your account settings
  3. Set the environment variable: GEMINI_API_KEY=your_key_here
Ollama
  1. Install Ollama from https://ollama.ai
  2. Pull your desired model: ollama pull llama3.2
  3. No API key needed - runs locally
Llamafile
  1. Download a Llamafile model from https://github.com/Mozilla-Ocho/llamafile/releases
  2. Make the file executable: chmod +x your-model.llamafile
  3. Run in JSON API mode: ./your-model.llamafile --json-api --host 127.0.0.1 --port 8080
  4. Use with the OpenAI provider pointing to http://localhost:8080
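Assuming the OpenAI provider honors the standard OpenAI client environment variables (OPENAI_BASE_URL and OPENAI_API_KEY; this is an assumption, not confirmed by this README), pointing it at the local Llamafile server might look like:

```shell
# Assumption: the OpenAI provider reads the standard OpenAI client variables.
# Llamafile exposes an OpenAI-compatible API under /v1.
export OPENAI_BASE_URL=http://localhost:8080/v1
# Llamafile does not validate the key, but the client requires one to be set.
export OPENAI_API_KEY=sk-no-key-required
```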
