
speedhive-tools

Python 3.10+ · MIT License

Python toolkit for the MyLaps Event Results API. Export race events, sessions, laps, and announcements to CSV, SQLite, or JSON with a single command.

Features

  • Full Data Export — Stream events, sessions, laps, and announcements for any organization
  • Multiple Output Formats — CSV, SQLite, JSON, and compressed NDJSON
  • Memory Efficient — Streaming architecture handles large datasets without high RAM usage
  • Resumable Downloads — Checkpoint support for interrupted exports
  • Interactive CLI — Process exported data with guided prompts or batch flags
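
The resumable-download feature follows a common checkpoint pattern: record completed work-item IDs in a small JSON file and skip them on restart. A minimal sketch of that pattern (the real `.checkpoint.json` schema is internal to the exporter; the `done_events` field name here is illustrative):

```python
import json
from pathlib import Path

def load_done(path: Path) -> set:
    """Return the set of already-exported event IDs, or an empty set on a first run."""
    if path.exists():
        return set(json.loads(path.read_text()).get("done_events", []))
    return set()

def mark_done(path: Path, event_id: int) -> None:
    """Record one finished event ID in the checkpoint file."""
    done = load_done(path)
    done.add(event_id)
    path.write_text(json.dumps({"done_events": sorted(done)}))

# Example: resume a run that already finished events 101 and 102.
ckpt = Path("/tmp/.checkpoint_demo.json")
ckpt.unlink(missing_ok=True)
for ev in (101, 102):
    mark_done(ckpt, ev)

pending = [ev for ev in (101, 102, 103) if ev not in load_done(ckpt)]
print(pending)  # only 103 remains
```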

Installation

From PyPI

pip install speedhive-tools

From Source

git clone https://github.com/ncrosty58/speedhive-tools.git
cd speedhive-tools
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt

Quick Start

1. Export Data

Export all data for an organization (events, sessions, laps, announcements):

python examples/export_full_dump.py --org 30476 --output ./output/full_dump --verbose

2. Process to CSV

Convert the exported NDJSON to flat CSV files:

python examples/processing/extract_laps_to_csv.py \
    --input output/full_dump/30476 \
    --out output/full_dump/30476/laps.csv
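
Under the hood, the extraction step amounts to flattening one JSON object per NDJSON line into CSV rows. A self-contained sketch of that flattening (the lap field names are hypothetical examples, not the exporter's actual schema):

```python
import csv
import io
import json

# Hypothetical NDJSON input: one JSON lap record per line.
ndjson = "\n".join([
    '{"sessionId": 1, "lapNr": 1, "lapTimeMs": 61234}',
    '{"sessionId": 1, "lapNr": 2, "lapTimeMs": 60980}',
])

records = [json.loads(line) for line in ndjson.splitlines() if line.strip()]

# Write the flat records as CSV (here to a string buffer instead of a file).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=sorted(records[0]))
writer.writeheader()
writer.writerows(records)

print(buf.getvalue())
```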

3. Or Use the Interactive CLI

python examples/processing/processor_cli.py

Usage

Export Commands

Command                                Description
export_full_dump.py --org <id>         Export all data (events, sessions, laps, announcements)
list_events_by_org.py <id>             List events for an organization
export_announcements_by_org.py <id>    Export announcements only
get_event_sessions.py <event_id>       Get sessions for a specific event
get_session_laps.py <session_id>       Get lap times for a session
get_session_results.py <session_id>    Get results for a session

Export Options

python examples/export_full_dump.py \
    --org 30476 \
    --output ./output/full_dump \
    --max-events 10 \
    --max-sessions-per-event 5 \
    --concurrency 2 \
    --verbose \
    --dry-run

Flag                        Description
--org                       Organization ID (required, repeatable)
--output                    Output directory (default: ./output/full_dump)
--max-events                Limit number of events to export
--max-sessions-per-event    Limit sessions per event
--concurrency               Parallel request limit (default: 5)
--token                     API token for authenticated endpoints
--dry-run                   Preview without writing files
--verbose                   Enable detailed logging

Processing Commands

Convert exported NDJSON to analysis-ready formats:

# Extract to CSV
python examples/processing/extract_laps_to_csv.py --input <dir> --out laps.csv
python examples/processing/extract_sessions_to_csv.py --input <dir> --out sessions.csv
python examples/processing/extract_announcements_to_csv.py --input <dir> --out announcements.csv

# Import to SQLite
python examples/processing/ndjson_to_sqlite.py --input <dir>/laps.ndjson.gz --out dump.db
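
The SQLite import follows the same idea: each NDJSON line becomes one table row. A minimal sketch with an in-memory database (the table and column names are illustrative, not the script's actual schema):

```python
import json
import sqlite3

# Hypothetical NDJSON lap records, one JSON object per line.
ndjson_lines = [
    '{"sessionId": 1, "lapNr": 1, "lapTimeMs": 61234}',
    '{"sessionId": 1, "lapNr": 2, "lapTimeMs": 60980}',
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE laps (session_id INTEGER, lap_nr INTEGER, lap_time_ms INTEGER)"
)
# Named placeholders let each parsed dict insert directly as one row.
conn.executemany(
    "INSERT INTO laps VALUES (:sessionId, :lapNr, :lapTimeMs)",
    (json.loads(line) for line in ndjson_lines),
)
conn.commit()

count, fastest = conn.execute(
    "SELECT COUNT(*), MIN(lap_time_ms) FROM laps"
).fetchone()
print(count, fastest)  # 2 60980
```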

Processor CLI

Interactive mode for batch processing:

# Interactive - prompts for org and output options
python examples/processing/processor_cli.py

# Non-interactive - process all data types
python examples/processing/processor_cli.py --org 30476 --run-all

Project Structure

speedhive-tools/
├── mylaps_client/          # Generated OpenAPI client (event_results_client)
├── examples/
│   ├── export_full_dump.py          # Main exporter
│   ├── list_events_by_org.py        # List org events
│   ├── export_announcements_by_org.py
│   ├── get_event_sessions.py
│   ├── get_session_laps.py
│   ├── get_session_results.py
│   └── processing/
│       ├── processor_cli.py         # Interactive processor
│       ├── extract_laps_to_csv.py
│       ├── extract_sessions_to_csv.py
│       ├── extract_announcements_to_csv.py
│       └── ndjson_to_sqlite.py
├── tests/                  # Unit tests
└── output/                 # Default export location (gitignored)

Output Format

The exporter creates gzipped NDJSON files:

output/full_dump/<org_id>/
├── events.ndjson.gz
├── sessions.ndjson.gz
├── laps.ndjson.gz
├── announcements.ndjson.gz
└── .checkpoint.json        # Resume state
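
Each `.ndjson.gz` file holds one JSON object per line, gzip-compressed, so it can be read as a stream without loading the whole file into memory. A sketch of that streaming read (a tiny sample file is written first so the example runs standalone; the field names are illustrative):

```python
import gzip
import json
import tempfile
from pathlib import Path

# Create a small sample laps.ndjson.gz so the example is self-contained.
sample = Path(tempfile.mkdtemp()) / "laps.ndjson.gz"
with gzip.open(sample, "wt", encoding="utf-8") as f:
    f.write('{"lapNr": 1, "lapTimeMs": 61234}\n')
    f.write('{"lapNr": 2, "lapTimeMs": 60980}\n')

def iter_ndjson(path):
    """Yield one decoded record at a time; memory use is constant regardless of file size."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        for line in f:
            if line.strip():
                yield json.loads(line)

laps = list(iter_ndjson(sample))
print(len(laps), laps[0]["lapNr"])  # 2 1
```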

Development

Run Tests

pip install pytest
pytest

Regenerate API Client

If the MyLaps API spec changes:

pip install openapi-python-client
openapi-python-client generate --url https://api2.mylaps.com/v3/api-docs --output-path ./mylaps_client

Build Distribution

pip install build
python -m build

CI/CD

This project uses GitHub Actions for automated testing and PyPI publishing. Pushing a version tag triggers:

  1. Run test suite
  2. Build sdist and wheel
  3. Publish to PyPI

git tag v0.1.3
git push origin v0.1.3

Contributing

Contributions welcome! Please:

  1. Fork the repository
  2. Create a feature branch
  3. Add tests for new functionality
  4. Submit a pull request

License

MIT © Nathan Crosty
