
mcp-nta

An MCP server for Ireland's National Transport Authority (NTA) public transport data. Provides focused, queryable tools that return small, human-readable results — not raw feed dumps.

Inspired by ireland-nta-mcp and tfi-gtfs.

Tools

Tool                   Description
---------------------  ----------------------------------------------------------------
search_stops           Find stops by name. Returns IDs, locations, routes served.
search_routes          Find routes by number or name.
get_stop_departures    Real-time departures from a stop, filtered by route/time window.
get_vehicle_positions  Live vehicle positions filtered by route or proximity.
get_service_alerts     Active service alerts filtered by route or stop.
get_route_stops        Ordered list of stops on a route.

Installation

# Using uv
uvx mcp-nta

# Or install with pip
pip install mcp-nta
python -m mcp_nta

Configuration

Environment variables

Variable           Required  Default       Description
-----------------  --------  ------------  --------------------------------------------------------------
NTA_API_KEY        Yes       (none)        API key from the NTA developer portal
NTA_ROUTES         No        (all routes)  Comma-separated whitelist of routes to index (see Route filtering)
NTA_REFRESH_HOURS  No        24            How often to re-download GTFS data, in hours

Claude Desktop / Claude Code

{
  "mcpServers": {
    "nta": {
      "command": "uvx",
      "args": ["mcp-nta"],
      "env": {
        "NTA_API_KEY": "YOUR_KEY",
        "NTA_ROUTES": "37,39a,DART,Green"
      }
    }
  }
}

Route filtering

By default, the server indexes every route in the NTA GTFS feed — over 1,000 routes, ~7 million schedule entries. This works fine but produces a ~400 MB cache and takes a couple of minutes on first start.

Setting NTA_ROUTES restricts the index to only the routes you care about. This dramatically reduces build time, disk usage, and memory:

                No filter  37,39a,DART
--------------  ---------  -----------
Cache size      ~400 MB    ~6 MB
First build     ~2 min     ~1 min
Memory (build)  ~120 MB    ~40 MB

Each entry in NTA_ROUTES is matched case-insensitively against either:

  • Route short name — the public-facing route number/name (e.g. 37, 39a, 102, Green, DART)
  • Route type — bus, rail, or tram, to include all routes of that type

Examples:

# Just a few bus routes
NTA_ROUTES="37,39a,46a"

# All rail + a specific bus
NTA_ROUTES="rail,37"

# All trams (Luas Green and Red lines)
NTA_ROUTES="tram"

# Everything (default when unset)
NTA_ROUTES=""
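The matching rules above could be sketched roughly like this (a minimal sketch; the function name and alias table are assumptions, not the project's actual code — only the GTFS route_type codes are standard):

```python
# Hypothetical sketch of NTA_ROUTES matching; names are assumptions.
ROUTE_TYPE_ALIASES = {"tram": 0, "rail": 2, "bus": 3}  # standard GTFS route_type codes

def route_matches(route_filter: str, short_name: str, route_type: int) -> bool:
    """Case-insensitively match one route against the comma-separated filter."""
    for entry in (e.strip().lower() for e in route_filter.split(",")):
        if not entry:
            continue
        if entry == short_name.lower():                   # short-name match, e.g. "37"
            return True
        if ROUTE_TYPE_ALIASES.get(entry) == route_type:   # type match, e.g. "rail"
            return True
    return False
```

An empty or unset filter would be handled before this check, since everything is indexed by default.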

Only stops served by whitelisted routes are kept. If the filter changes between runs, the database is automatically rebuilt.

Architecture

Static data

On first run, the server downloads two datasets from Transport for Ireland:

  1. GTFS schedule — routes, stops, trips, stop times, and service calendars
  2. NaPTAN stop points — supplementary stop metadata (street names)

These are parsed and stored in a SQLite database at ~/.cache/mcp-nta/gtfs.db. On subsequent starts, the server opens the existing database with no parsing — startup is instant. The database is rebuilt automatically when it expires (controlled by NTA_REFRESH_HOURS) or when the route filter changes.

The SQLite approach means the server never loads the full dataset into memory. Queries read only the specific rows they need via indexed lookups.
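The indexed-lookup pattern described above might look like this (table and column names follow the standard GTFS stop_times schema, but are not necessarily the project's actual schema):

```python
# Hypothetical sketch of a targeted, indexed SQLite lookup: read only the
# rows needed for one stop instead of loading the feed into memory.
import sqlite3

def next_departures(conn: sqlite3.Connection, stop_id: str, after: str, limit: int = 10):
    """Return upcoming (trip_id, departure_time) rows for one stop."""
    # A composite index makes this a range scan over a handful of rows.
    conn.execute(
        "CREATE INDEX IF NOT EXISTS idx_stop_times_stop "
        "ON stop_times (stop_id, departure_time)"
    )
    return conn.execute(
        "SELECT trip_id, departure_time FROM stop_times "
        "WHERE stop_id = ? AND departure_time >= ? "
        "ORDER BY departure_time LIMIT ?",
        (stop_id, after, limit),
    ).fetchall()
```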

Real-time data

Real-time feeds (GTFS-RT) are fetched from the NTA API on demand and cached in memory for 30 seconds:

  • TripUpdates — delays and predicted arrival times
  • VehiclePositions — live GPS positions of vehicles
  • ServiceAlerts — disruption notices
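A 30-second in-memory cache of this kind can be sketched as follows (class name and structure are assumptions; only the TTL behaviour is taken from the description above):

```python
# Hypothetical sketch of a TTL cache for a GTFS-RT feed: refetch at most
# once per `ttl` seconds, otherwise serve the cached value.
import time

class FeedCache:
    def __init__(self, fetch, ttl: float = 30.0):
        self._fetch = fetch        # callable that hits the NTA API
        self._ttl = ttl
        self._value = None
        self._expires = 0.0        # monotonic deadline; 0 forces first fetch

    def get(self):
        now = time.monotonic()
        if now >= self._expires:
            self._value = self._fetch()
            self._expires = now + self._ttl
        return self._value
```

One cache instance per feed (TripUpdates, VehiclePositions, ServiceAlerts) keeps each feed's staleness independent.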

Each tool combines a targeted SQLite query against the static schedule with the relevant real-time feed to produce a concise, human-readable answer.

Example

"What buses are due at Oaktree Green for the 37?"

The LLM calls search_stops(query="Oaktree Green"), gets back stop IDs, then calls get_stop_departures(stop_id="8240DB001682", route="37") and gets:

Upcoming departures from Oaktree Green (stop 8240DB001682) for route 37:

1. 37 -> Wilton Terrace | Due: 14:22 (in 4 min) — scheduled 14:20, +1 min late
2. 37 -> Wilton Terrace | Due: 14:40 (in 22 min) — on time
3. 37 -> Wilton Terrace | Due: 15:03 (in 45 min) — scheduled (no live data)

Two tool calls, two small responses, complete answer.

License

MIT
