z/OS FTP MCP Server
A Model Context Protocol (MCP) server for interacting with z/OS mainframe systems via FTP.
Features
- List datasets from mainframe catalog with pattern matching
- Download datasets in binary and text formats with encoding conversion
- Download members from partitioned datasets (PDS) with parallel support
- Submit and monitor JCL jobs
- Advanced text processing: encoding conversion, line ending control, trailing spaces handling
- Configurable write protection for safe production use
- Connection management via environment variables
Installation
For MCP Server Usage
No installation needed! The MCP configuration with uvx will automatically download and run the package.
For Python Library Usage
pip install zos-ftp-mcp
Configuration
Set environment variables for connection:
export ZFTP_HOST="your-mainframe-host"
export ZFTP_PORT="21"
export ZFTP_USER="your-username"
export ZFTP_PASSWORD="your-password"
export ZFTP_TIMEOUT="600.0"
export ZFTP_DOWNLOAD_PATH="/path/to/downloads"
export ZFTP_DEBUG="false" # Set to "true" for detailed FTP protocol logging
export ZFTP_ALLOW_WRITE="false" # Set to "true" to enable upload/submit operations
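For scripts that wrap or test the server, the variables above can be collected into a single config object. This is a hypothetical helper, not part of the package's API; the variable names match the docs, and the fallback defaults are assumptions taken from the examples above.

```python
import os
from dataclasses import dataclass

@dataclass
class ZftpConfig:
    """Connection settings collected from the documented ZFTP_* variables."""
    host: str
    port: int = 21
    user: str = ""
    password: str = ""
    timeout: float = 600.0
    allow_write: bool = False

def load_config() -> ZftpConfig:
    # Fall back to the documented defaults when a variable is unset.
    return ZftpConfig(
        host=os.environ.get("ZFTP_HOST", ""),
        port=int(os.environ.get("ZFTP_PORT", "21")),
        user=os.environ.get("ZFTP_USER", ""),
        password=os.environ.get("ZFTP_PASSWORD", ""),
        timeout=float(os.environ.get("ZFTP_TIMEOUT", "600.0")),
        allow_write=os.environ.get("ZFTP_ALLOW_WRITE", "false").lower() == "true",
    )
```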
Advanced Text Processing
Control how text files downloaded from the mainframe are converted from EBCDIC to ASCII/UTF-8:
# Character encoding conversion (EBCDIC to ASCII/UTF-8)
export ZFTP_DEFAULT_ENCODING="IBM-037,UTF-8" # US/Canada EBCDIC to UTF-8
# Other options: IBM-1047,UTF-8 (Latin-1), IBM-285,UTF-8 (UK)
# Line ending format
export ZFTP_DEFAULT_LINE_ENDING="LF" # Unix/Linux style
# Options: CRLF (Windows), LF (Unix), CR (old Mac), NONE
# Preserve trailing spaces in fixed-length records
export ZFTP_PRESERVE_TRAILING_SPACES="false" # Strip trailing spaces
# Set to "true" to keep 80-character fixed-length records intact
When to use these settings:
- Encoding: Essential for proper character conversion (JCL, COBOL, logs)
- Line endings: Cross-platform compatibility, version control (Git)
- Trailing spaces: Keep "false" for most text files, "true" for fixed-format data
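A minimal sketch of the conversion these settings control, using Python's stdlib `cp037` codec (the standard-library name for IBM-037). The function name and record handling are illustrative assumptions, not the server's actual implementation.

```python
# Decode EBCDIC records, optionally strip the trailing spaces that pad
# fixed-length records, and join with the requested line ending.
LINE_ENDINGS = {"LF": "\n", "CRLF": "\r\n", "CR": "\r", "NONE": ""}

def convert_records(records: list[bytes],
                    codec: str = "cp037",  # stdlib codec for IBM-037
                    line_ending: str = "LF",
                    preserve_trailing_spaces: bool = False) -> str:
    lines = []
    for rec in records:
        text = rec.decode(codec)
        if not preserve_trailing_spaces:
            text = text.rstrip(" ")
        lines.append(text)
    return LINE_ENDINGS[line_ending].join(lines)
```

With the defaults, two 80-byte EBCDIC records come out as two space-stripped lines joined by `\n`; with `preserve_trailing_spaces=True` each line keeps its full 80 characters.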
Write Operations
By default, the MCP server is read-only for safety. To enable write operations:
export ZFTP_ALLOW_WRITE="true"
When enabled, you can:
- Upload files to mainframe datasets (upload_dataset)
- Submit JCL jobs (submit_job)
Use with caution in production environments. Read-only operations (list, download, monitor jobs) are always available.
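The guard behind this flag could look roughly like the sketch below. The function and exception names are hypothetical stand-ins for whatever the server does internally; only the environment variable name comes from the docs.

```python
import os

class WriteProtectedError(PermissionError):
    """Raised when a write tool is called while ZFTP_ALLOW_WRITE is off."""

def ensure_write_allowed(operation: str) -> None:
    # Write tools would call this before touching the mainframe.
    # Anything other than the literal string "true" keeps writes disabled.
    if os.environ.get("ZFTP_ALLOW_WRITE", "false").lower() != "true":
        raise WriteProtectedError(
            f"{operation} is disabled: set ZFTP_ALLOW_WRITE=true to enable write operations"
        )
```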
Usage
As MCP Server
{
"mcpServers": {
"zos-ftp-mcp": {
"command": "uvx",
"args": ["zos-ftp-mcp"],
"env": {
"ZFTP_HOST": "your-mainframe-host",
"ZFTP_USER": "your-username",
"ZFTP_PASSWORD": "your-password",
"ZFTP_DOWNLOAD_PATH": "/path/to/downloads"
}
}
}
}
Direct Command Line
zos-ftp-mcp
Tools Available
Dataset Operations
- list_catalog(pattern, limit, offset) - List datasets matching a pattern, with pagination
- download_binary(source_dataset, target_file) - Download a dataset in binary mode
- download_text(source_dataset, target_file, encoding, line_ending, preserve_trailing_spaces) - Download a dataset with text processing
- download_pds_members(dataset, target_dir, members, encoding, line_ending, ...) - Download PDS members with advanced options
- upload_dataset(source_file, target_dataset, binary, lrecl, blksize, recfm, space) - Upload a file to a dataset (requires ZFTP_ALLOW_WRITE=true)
- get_vsam_info(dataset, limit, offset) - Get VSAM dataset information, with pagination (requires ZFTP_ALLOW_WRITE=true)
- get_gdg_info(gdg_base, limit, offset) - Get GDG base attributes and generations, with pagination (requires ZFTP_ALLOW_WRITE=true)
Job Operations
- submit_job(jcl, jobname) - Submit a JCL job (requires ZFTP_ALLOW_WRITE=true)
- list_jes_jobs(jobmask, owner, status, limit, offset) - List jobs with filtering and pagination
- get_job_info(jobid) - Get job details
- download_job_spool(jobid, target_file) - Download job output with return code extraction
Connection Management
- get_connection_info() - Show current connection settings
Sample Usage Prompts
Once configured, you can use these prompts:
Dataset Discovery
"List all datasets starting with SYS1 to explore system datasets"
"Show me all user datasets matching MYUSER.* pattern"
Data Download
"Download the dataset MYUSER.COBOL.SOURCE to my local downloads folder with UTF-8 encoding"
"Download all members from the PDS MYUSER.COBOL.COPYLIB to a local directory"
Data Upload (requires ZFTP_ALLOW_WRITE=true)
"Upload my local file test.jcl to dataset MYUSER.JCL.TEST with lrecl=80 and recfm=FB"
"Upload local file member.cbl to PDS member MYUSER.COBOL.SOURCE(TESTPGM)"
Job Operations
"Submit this JCL job and wait for completion" (requires ZFTP_ALLOW_WRITE=true)
"List all my jobs that are in OUTPUT status"
"Download the spool output for job JOB12345 and show me the return code"
Connection Management
"Show me the current FTP connection settings"
Dependencies
- mcp - Model Context Protocol
- Python 3.10+ standard library (ftplib, pathlib, etc.)
Advanced Features
Pagination Support
To avoid overwhelming context windows with large result sets, several tools support pagination:
Tools with pagination:
- list_catalog - Paginate through datasets
- list_jes_jobs - Paginate through jobs
- get_vsam_info - Paginate through VSAM datasets
- get_gdg_info - Paginate through GDG generations
Usage:
# Get first 50 datasets
list_catalog('USER.*', limit=50)
# Get next 50 datasets
list_catalog('USER.*', limit=50, offset=50)
# Get first 20 jobs
list_jes_jobs(status='OUTPUT', limit=20)
# Get first 10 GDG generations
get_gdg_info('AWS.M2.CARDDEMO.TRANSACT.BKUP', limit=10)
Response includes:
- count - Number of items returned in this page
- total - Total number of items available
- offset - Current offset
- has_more - True if more items are available
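The pagination envelope described above can be sketched as a small helper. The field names follow the docs; the function itself is illustrative, not the server's code.

```python
def paginate(items: list, limit: int = 50, offset: int = 0) -> dict:
    # Slice one page out of the full result set and wrap it in the
    # count/total/offset/has_more envelope described above.
    page = items[offset:offset + limit]
    return {
        "items": page,
        "count": len(page),
        "total": len(items),
        "offset": offset,
        "has_more": offset + len(page) < len(items),
    }
```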
JESINTERFACELEVEL Limitations
The server automatically detects your mainframe's JES interface level. With JESINTERFACELEVEL=1 (common in older systems):
Job Submission:
- Jobname in JCL must be userid + exactly one character
- Example: If userid is USER123, jobname must be USER123J
- The server validates this automatically and provides clear error messages
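The jobname rule above amounts to a one-line check. The function name is a hypothetical stand-in for the server's internal validation.

```python
def validate_jobname(userid: str, jobname: str) -> bool:
    # JESINTERFACELEVEL=1 rule: jobname must be the userid plus exactly
    # one extra character (comparison is case-insensitive, since z/OS
    # names are conventionally upper case).
    return (len(jobname) == len(userid) + 1
            and jobname.upper().startswith(userid.upper()))
```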
Job Listing:
- Server-side filtering may not work
- Client-side filtering is applied automatically
- Jobs may be purged quickly from spool
Workaround: The server handles these limitations transparently. Just be aware that job retrieval may fail if jobs are purged quickly.
Parallel PDS Downloads
Download PDS members faster using multiple FTP connections:
# Sequential (default)
download_pds_members(dataset, target_dir, members, ftp_threads=1)
# Parallel (2-16 threads)
download_pds_members(dataset, target_dir, members, ftp_threads=4)
Parallel downloads are significantly faster for large PDSs (>10 members).
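The fan-out could be structured like the sketch below, which distributes members over a thread pool. It is a sketch, not the package's implementation: `download_one` is a hypothetical per-member callable, and the assumption is that each worker uses its own FTP connection, since a single FTP session is not safe to share across threads.

```python
from concurrent.futures import ThreadPoolExecutor

def download_members_parallel(members: list[str], download_one,
                              ftp_threads: int = 4) -> dict:
    # Submit one task per member; with ftp_threads workers, up to that
    # many members are fetched concurrently. Results are collected by
    # member name; fut.result() re-raises any worker exception.
    results = {}
    with ThreadPoolExecutor(max_workers=ftp_threads) as pool:
        futures = {pool.submit(download_one, m): m for m in members}
        for fut, member in futures.items():
            results[member] = fut.result()
    return results
```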
Per-Operation Overrides
While defaults are set via environment variables, you can override them per operation:
# Use different line ending for specific file
download_text(dataset, target_file, line_ending='CRLF')
# Preserve trailing spaces for fixed-format data
download_text(dataset, target_file, preserve_trailing_spaces=True)
# Use different encoding
download_pds_members(dataset, target_dir, members, encoding='IBM-1047,UTF-8')
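The resolution order implied here (per-call override, then the ZFTP_* environment default, then a built-in fallback) can be sketched as a small helper; the function name is an assumption.

```python
import os

def effective_setting(override, env_var: str, default: str) -> str:
    # A per-operation argument wins; otherwise fall back to the
    # environment default, then to the built-in fallback.
    if override is not None:
        return override
    return os.environ.get(env_var, default)
```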
License
MIT License - see LICENSE file for details.
File details
Details for the file zos_ftp_mcp-0.1.1.tar.gz.
File metadata
- Download URL: zos_ftp_mcp-0.1.1.tar.gz
- Upload date:
- Size: 35.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 36cfee1dd675e5cd365d2f0a2611e6856dbb6ab51341fc52255724835b649b1e |
| MD5 | 906b5ae5c794047742662c429e9c6675 |
| BLAKE2b-256 | 619e97d1980cd4dc7b62776ded9d44a99b8fbe384ac2a0ab9a4a5ede97d62ab8 |
File details
Details for the file zos_ftp_mcp-0.1.1-py3-none-any.whl.
File metadata
- Download URL: zos_ftp_mcp-0.1.1-py3-none-any.whl
- Upload date:
- Size: 34.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: uv/0.8.3
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | cb5e6678ecdb7765a522c4df38be5f65f89febc89e92dd7fe1a37361dc76489a |
| MD5 | 4887e9293974445c6339c1b28e5a6607 |
| BLAKE2b-256 | 90987d43793e67559193253dda549c6e08078656932d1510447a3d55b96db428 |