
FastTransfer MCP Server

A Model Context Protocol (MCP) server that exposes FastTransfer functionality for efficient data transfer between various database systems.

Overview

FastTransfer is a high-performance CLI tool for transferring data between databases. This MCP server wraps FastTransfer functionality and provides:

  • Safety-first approach: Preview commands before execution with user confirmation required
  • Password masking: Credentials and connection strings are never displayed in logs or output
  • Intelligent validation: Parameter validation with database-specific compatibility checks
  • Smart suggestions: Automatic parallelism method recommendations
  • Version detection: Automatic binary version detection with capability registry
  • Comprehensive logging: Full execution logs with timestamps and results

MCP Tools

1. preview_transfer_command

Build and preview a FastTransfer command WITHOUT executing it. Shows the exact command with passwords masked. Always use this first.

2. execute_transfer

Execute a previously previewed command. Requires confirmation: true as a safety mechanism.

3. validate_connection

Validate database connection parameters (parameter check only, does not test actual connectivity).

4. list_supported_combinations

List all supported source-to-target database combinations.

5. suggest_parallelism_method

Recommend the optimal parallelism method based on source database type and table characteristics.

6. get_version

Report the detected FastTransfer binary version, supported types, and feature flags.
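To make the tool interface concrete, here is a sketch of an argument payload a preview_transfer_command call might carry. The field names combine the documented transfer options (method, load_mode, degree) with illustrative source/target fields; the exact schema is defined by the server's Pydantic models, so the source/target names here are assumptions:

```python
# Hypothetical preview_transfer_command arguments (illustrative only;
# source_*/target_* field names are assumptions, not the server's schema).
preview_args = {
    "source_type": "pgsql",            # assumed type identifier
    "source_server": "localhost:5432",
    "source_database": "sales_db",
    "target_type": "mssql",
    "target_server": "localhost:1433",
    "target_database": "warehouse",
    "method": "Ctid",                  # documented transfer option
    "load_mode": "Truncate",           # documented: Append or Truncate
    "degree": 0,                       # documented: 0 = auto parallelism
}
```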

Parallelism Methods

Method       Best For                                       Requires Key
Ctid         PostgreSQL sources                             No
Rowid        Oracle sources                                 No
Physloc      SQL Server sources without a numeric key       No
NZDataSlice  Netezza sources                                No
RangeId      Large tables with a numeric key                Yes
Random       Tables with an evenly distributed numeric key  Yes
DataDriven   Any column type with distinct values           Yes
Ntile        Even distribution across workers               Yes
None         Small tables or troubleshooting                No
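The keyless recommendations in the table can be sketched as a simple lookup, roughly what suggest_parallelism_method does (a minimal sketch; the source-type identifiers such as "pgsql" are assumptions, and the server's real heuristics may also weigh table size and key distribution):

```python
# Source-specific keyless methods, mirroring the table above.
KEYLESS_METHODS = {
    "pgsql": "Ctid",           # PostgreSQL physical row locator
    "oracle": "Rowid",         # Oracle physical row id
    "mssql": "Physloc",        # SQL Server physical location, no key needed
    "netezza": "NZDataSlice",  # Netezza data-slice parallelism
}

def suggest_method(source_type: str, has_numeric_key: bool) -> str:
    """Prefer a key-based method when a numeric key exists, otherwise a
    source-specific keyless method, falling back to None (sequential)."""
    if has_numeric_key:
        return "RangeId"
    return KEYLESS_METHODS.get(source_type, "None")
```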

Installation

Prerequisites

  • Python 3.10 or higher
  • FastTransfer binary v0.16+ (obtain from Arpe.io)
  • Claude Code or another MCP client

Setup

  1. Clone or download this repository:

    cd /path/to/fasttransfer-mcp
    
  2. Install Python dependencies:

    pip install -r requirements.txt
    
  3. Configure environment:

    cp .env.example .env
    # Edit .env with your FastTransfer path
    
  4. Add to Claude Code configuration (~/.claude.json):

    {
      "mcpServers": {
        "fasttransfer": {
          "type": "stdio",
          "command": "python",
          "args": ["/absolute/path/to/fasttransfer-mcp/src/server.py"],
          "env": {
            "FASTTRANSFER_PATH": "/absolute/path/to/fasttransfer/FastTransfer"
          }
        }
      }
    }
    
  5. Restart Claude Code to load the MCP server.

  6. Verify installation:

    # In Claude Code, run:
    /mcp
    # You should see "fasttransfer: connected"
    

Configuration

Environment Variables

Edit .env to configure:

# Path to FastTransfer binary (required)
FASTTRANSFER_PATH=./fasttransfer/FastTransfer

# Execution timeout in seconds (default: 1800 = 30 minutes)
FASTTRANSFER_TIMEOUT=1800

# Log directory (default: ./logs)
FASTTRANSFER_LOG_DIR=./logs

# Log level (default: INFO)
LOG_LEVEL=INFO
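A minimal sketch of how a server might read these variables with the documented defaults (illustrative only; the function and key names below are assumptions, not the server's actual API):

```python
import os

def load_config() -> dict:
    """Read the documented environment variables, applying the documented
    defaults. FASTTRANSFER_PATH is required, so it has no fallback."""
    path = os.environ.get("FASTTRANSFER_PATH")
    if not path:
        raise RuntimeError("FASTTRANSFER_PATH is required")
    return {
        "binary_path": path,
        "timeout_seconds": int(os.environ.get("FASTTRANSFER_TIMEOUT", "1800")),
        "log_dir": os.environ.get("FASTTRANSFER_LOG_DIR", "./logs"),
        "log_level": os.environ.get("LOG_LEVEL", "INFO"),
    }
```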

Connection Options

The server supports multiple ways to authenticate and connect:

Parameter        Description
server           Host:port or host\instance (optional when connect_string or dsn is used)
user / password  Standard credentials
trusted_auth     Windows trusted authentication
connect_string   Full connection string (mutually exclusive with server/user/password/dsn)
dsn              ODBC DSN name (mutually exclusive with server/provider)
provider         OleDB provider name
file_input       File path for data input (source only; mutually exclusive with query)
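The exclusivity rules in the table can be expressed as a small validation check. This is an illustrative sketch, not the server's implementation (which uses Pydantic models):

```python
def check_exclusivity(params: dict) -> list[str]:
    """Return violations of the documented mutual-exclusion rules for
    connection parameters (empty list means the combination is allowed)."""
    errors = []
    if "connect_string" in params:
        for banned in ("server", "user", "password", "dsn"):
            if banned in params:
                errors.append(f"connect_string is mutually exclusive with {banned}")
    if "dsn" in params:
        for banned in ("server", "provider"):
            if banned in params:
                errors.append(f"dsn is mutually exclusive with {banned}")
    return errors
```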

Transfer Options

Option                 CLI Flag               Description
method                 --method               Parallelism method
distribute_key_column  --distributeKeyColumn  Column for data distribution
degree                 --degree               Parallelism degree (0 = auto, >0 = fixed, <0 = CPU-adaptive)
load_mode              --loadmode             Append or Truncate
batch_size             --batchsize            Batch size for bulk operations
map_method             --mapmethod            Column mapping: Position or Name
run_id                 --runid                Run ID for logging
data_driven_query      --datadrivenquery      Custom SQL for the DataDriven method
use_work_tables        --useworktables        Intermediate work tables for CCI
settings_file          --settingsfile         Custom settings JSON file
log_level              --loglevel             Override log level (error/warning/information/debug/fatal)
no_banner              --nobanner             Suppress banner output
license_path           --license              License file path or URL
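The option-to-flag mapping above lends itself to a simple command builder. The sketch below uses only the documented flags; how the real builder handles boolean switches is an assumption:

```python
# Tool option names mapped to CLI flags, exactly as in the table above.
FLAG_MAP = {
    "method": "--method",
    "distribute_key_column": "--distributeKeyColumn",
    "degree": "--degree",
    "load_mode": "--loadmode",
    "batch_size": "--batchsize",
    "map_method": "--mapmethod",
    "run_id": "--runid",
    "data_driven_query": "--datadrivenquery",
    "use_work_tables": "--useworktables",
    "settings_file": "--settingsfile",
    "log_level": "--loglevel",
    "no_banner": "--nobanner",
    "license_path": "--license",
}

def build_args(options: dict) -> list[str]:
    """Turn a tool-options dict into a flat argument list for the binary.
    Boolean True is treated as a bare switch (an assumption for flags
    like --nobanner); all other values are passed as flag/value pairs."""
    args = []
    for key, value in options.items():
        flag = FLAG_MAP[key]
        if value is True:
            args.append(flag)
        else:
            args.extend([flag, str(value)])
    return args
```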

Usage Examples

PostgreSQL to SQL Server Transfer

User: "Copy the 'orders' table from PostgreSQL (localhost:5432, database: sales_db,
       schema: public) to SQL Server (localhost:1433, database: warehouse, schema: dbo).
       Use parallel transfer and truncate the target first."

Claude Code will:
1. Call suggest_parallelism_method to recommend Ctid for PostgreSQL
2. Call preview_transfer_command with your parameters
3. Show the command with masked passwords
4. Explain what will happen
5. Ask for confirmation
6. Execute with execute_transfer when you approve

File Import via DuckDB Stream

User: "Import /data/export.parquet into the SQL Server 'staging' table
       using DuckDB stream."

Claude Code will use duckdbstream source type with file_input parameter.

Check Version and Capabilities

User: "What version of FastTransfer is installed?"

Claude Code will call get_version and display the detected version,
supported source/target types, and available features.

Two-Step Safety Process

This server implements a mandatory two-step process:

  1. Preview - Always use preview_transfer_command first
  2. Execute - Use execute_transfer with confirmation: true

You cannot execute without previewing first and confirming.
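The preview-then-execute gate can be sketched as follows. This is a minimal illustration of the safety mechanism, not the server's actual code; the class and method names are hypothetical:

```python
class TransferSession:
    """Illustrative gate enforcing preview-before-execute."""

    def __init__(self):
        self._previewed = None  # last previewed command, if any

    def preview(self, command: list[str]) -> list[str]:
        """Record and return the command (shown with passwords masked)."""
        self._previewed = command
        return command

    def execute(self, confirmation: bool = False) -> str:
        """Refuse to run without a prior preview and explicit confirmation."""
        if self._previewed is None:
            raise RuntimeError("Nothing previewed: call preview_transfer_command first")
        if not confirmation:
            raise RuntimeError("Refusing to run without confirmation: true")
        return "executing: " + " ".join(self._previewed)
```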

Security

  • Passwords and connection strings are masked in all output and logs
  • Sensitive flags masked: --sourcepassword, --targetpassword, --sourceconnectstring, --targetconnectstring, -x, -X, -g, -G
  • Use environment variables for sensitive configuration
  • Review commands carefully before executing
  • Use minimum required database permissions
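Masking the values of the sensitive flags listed above can be sketched like this (illustrative; the server's real masking may also scrub embedded passwords inside connection strings):

```python
# Flags whose values must never appear in logs, as listed above.
SENSITIVE_FLAGS = {
    "--sourcepassword", "--targetpassword",
    "--sourceconnectstring", "--targetconnectstring",
    "-x", "-X", "-g", "-G",
}

def mask_command(args: list[str]) -> list[str]:
    """Replace the value following any sensitive flag with asterisks,
    leaving all other tokens untouched."""
    masked = list(args)
    for i, token in enumerate(masked[:-1]):
        if token in SENSITIVE_FLAGS:
            masked[i + 1] = "********"
    return masked
```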

Testing

Run the test suite:

# Run all tests
python -m pytest tests/ -v

# Run with coverage
python -m pytest tests/ --cov=src --cov-report=html

Project Structure

fasttransfer-mcp/
  src/
    __init__.py
    server.py          # MCP server (tool definitions, handlers)
    fasttransfer.py    # Command builder, executor, suggestions
    validators.py      # Pydantic models, enums, validation
    version.py         # Version detection and capabilities registry
  tests/
    __init__.py
    test_command_builder.py
    test_validators.py
    test_version.py
  .env.example
  requirements.txt
  CHANGELOG.md
  README.md

License

This MCP server wrapper is provided as-is. FastTransfer itself is a separate product from Arpe.io.
