
GTFS-RT Fetcher and Aggregator to Parquet format


GTFS-RT Aggregator

This project provides a pipeline for fetching, storing, and aggregating GTFS-RT (General Transit Feed Specification Realtime) data from multiple providers into Parquet format.

Features

  • Fetch GTFS-RT data from multiple providers and APIs
  • Store individual data files in Parquet format with multiple storage backends (filesystem, Google Cloud Storage, MinIO)
  • Aggregate data files based on configurable time intervals
  • Run fetcher and aggregator services in parallel
  • Configurable via a single TOML configuration file

Requirements

  • Python 3.11+
  • Required Python packages (see requirements.txt):
    • requests
    • gtfs-realtime-bindings
    • pandas
    • pyarrow
    • schedule
    • pydantic
    • google-cloud-storage (optional, for GCS storage)
    • minio (optional, for MinIO storage)

Installation

From PyPI (recommended)

pip install gtfs-rt-aggregator

From Source

  1. Clone this repository:

    git clone https://github.com/GaspardMerten/gtfs-rt-aggregator
    cd gtfs-rt-aggregator
    
  2. Install in development mode:

    pip install -e .
    

Configuration

The pipeline is configured using a TOML configuration file. Here's an example:

# GTFS-RT Configuration File
[storage]
type = "filesystem"  # Options: "filesystem", "gcs", or "minio"
[storage.params]
base_directory = "data"  # Base directory for filesystem storage

# Provider configurations
[[providers]]
name = "ovapi"
timezone = "Europe/Amsterdam"

  [[providers.apis]]
  url = "https://gtfs.ovapi.nl/nl/vehiclePositions.pb"
  services = ["VehiclePosition"]
  refresh_seconds = 20  # Fetch every 20 seconds
  frequency_minutes = 60  # Group files in 60-minute intervals
  check_interval_seconds = 300  # Check for new files every 5 minutes

  [[providers.apis]]
  url = "https://gtfs.ovapi.nl/nl/tripUpdates.pb"
  services = ["TripUpdate"]
  refresh_seconds = 20  # Fetch every 20 seconds

Storage Backend Examples

Google Cloud Storage

[storage]
type = "gcs"
[storage.params]
bucket_name = "my-gtfs-bucket"
base_path = "gtfs-data"  # Optional: subfolder within the bucket
# Authentication is handled via the GOOGLE_APPLICATION_CREDENTIALS environment variable

MinIO Storage

[storage]
type = "minio"
[storage.params]
endpoint = "minio.example.com:9000"
access_key = "YOUR_ACCESS_KEY"
secret_key = "YOUR_SECRET_KEY"
bucket_name = "gtfs-data"
secure = true  # Use HTTPS
base_path = "gtfs-feeds"  # Optional: subfolder within the bucket

Configuration Options

  • storage: Global storage configuration

    • type: Storage backend type ("filesystem", "gcs", or "minio")
    • params: Backend-specific parameters
  • providers: List of GTFS-RT data providers

    • name: Name of the provider (used for directory structure)
    • timezone: Timezone for the provider's data
    • apis: List of API endpoints for this provider
      • url: URL of the GTFS-RT feed
      • services: List of service types to extract from the feed (VehiclePosition, TripUpdate, Alert)
      • refresh_seconds: How often, in seconds, to fetch data from this API
      • frequency_minutes: The length, in minutes, of each aggregation window for grouping files
      • check_interval_seconds: How often, in seconds, to check for new files to aggregate
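These timing parameters interact: with the example values (refresh_seconds = 20, frequency_minutes = 60), each aggregation window collects roughly 180 individual files. A sketch of that arithmetic, assuming every fetch succeeds; the real count can be lower if the feed is temporarily unavailable:

```python
# Values from the example configuration above
refresh_seconds = 20          # one fetch every 20 seconds
frequency_minutes = 60        # aggregate into 60-minute windows
check_interval_seconds = 300  # look for completed windows every 5 minutes

# Upper bound on individual Parquet files per aggregated window
fetches_per_window = (frequency_minutes * 60) // refresh_seconds
print(fetches_per_window)  # → 180
```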

Usage

Command Line

Run the pipeline with a configuration file:

gtfs-rt-pipeline configuration.toml

You can adjust the logging level with the --log-level parameter:

gtfs-rt-pipeline configuration.toml --log-level DEBUG

Programmatic Usage

from gtfs_rt_aggregator import run_pipeline_from_toml

# Run pipeline from a TOML file
run_pipeline_from_toml("configuration.toml")

Or with a configuration object:

from gtfs_rt_aggregator.config.loader import load_config_from_toml
from gtfs_rt_aggregator import run_pipeline

# Load configuration
config = load_config_from_toml("configuration.toml")

# Run pipeline
run_pipeline(config)

Project Structure

src/gtfs_rt_aggregator/
  ├── __init__.py                # Package initialization
  ├── pipeline.py                # Main pipeline implementation
  ├── aggregator/                # Aggregation functionality
  ├── config/                    # Configuration loading and validation
  ├── fetcher/                   # GTFS-RT data fetching functionality
  ├── storage/                   # Storage backend implementations
  └── utils/                     # Utility functions and helpers
      ├── cli.py                 # Command-line interface
      └── ...

License

MIT License


Download files

Source Distribution

gtfs_rt_aggregator-0.1.3.tar.gz (23.9 kB)

  • Uploaded via: twine/6.1.0 CPython/3.12.9 (Trusted Publishing)
  • SHA256: d1881534ae30d48c91314fb77ad39357a1d0cfd14631cc0eb1c6790c93dc2022
  • MD5: 0f18ce26893d2a01bd4d2a5dc8bc200e
  • BLAKE2b-256: 499e090709748e9c1222bf4319c656d6d274de1192d2aa10b280a05e3dc3f7e4
  • Provenance: published via python-publish.yml on GaspardMerten/gtfs-rt-aggregator

Built Distribution

gtfs_rt_aggregator-0.1.3-py3-none-any.whl (30.2 kB)

  • SHA256: 59ad05a562f61d48b74c08f3cc167b6f7096faae9863070752f759e3713adb22
  • MD5: a340598d5ff9c625cece68720f4dcb4b
  • BLAKE2b-256: c9a3c167e82c5bf7a5006360f988f43ba5cc5f61fef61be68c079265553bd7d3
  • Provenance: published via python-publish.yml on GaspardMerten/gtfs-rt-aggregator
