# speedhive-tools

Utilities and examples for interacting with the MyLaps / Event Results API using a locally generated OpenAPI Python client. This repo includes the generated client and example scripts for exporting and processing event, session, and lap data.
## Table of contents

- Quick Start
- What’s in this repo
- Common commands
- Process exported data
- Notes & tips
- Regenerating the client
- Testing and CI
- Contributing and next steps
## Quick Start

Requirements: Python 3.10+ and a virtualenv.

```bash
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```
## What’s in this repo

- `mylaps_client/`: generated OpenAPI Python client (importable as `event_results_client` when running examples from the repo root).
- `examples/`: runnable examples that demonstrate common API tasks.
- `examples/processing/`: data-processing helpers (convert NDJSON to CSV/SQLite) and an interactive CLI.
- `output/`: suggested place for example outputs (this directory is in `.gitignore`).
## Common commands

List events for an org:

```bash
python examples/list_events_by_org.py 30476 --verbose
```

Export announcements for an org (one JSON file per event):

```bash
python examples/export_announcements_by_org.py 30476 --output ./output/announcements --verbose
```

Full dump (streams NDJSON, gzipped by default):

```bash
python examples/export_full_dump.py --org 30476 --output ./output/full_dump --verbose
```
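The full dump writes one JSON object per line (NDJSON), gzip-compressed by default, which can be read back incrementally with only the standard library. A minimal sketch, assuming one record per line; the field names (`sessionId`, `lapNr`, `lapTime`) are illustrative, not the exporter's actual schema:

```python
import gzip
import json

def read_ndjson_gz(path):
    """Yield one parsed record per line from a gzip-compressed NDJSON file."""
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:  # tolerate trailing blank lines
                yield json.loads(line)

# Demo with a small self-made file; real dumps live under output/full_dump/<org>/.
# The keys below are made up for illustration.
demo = "demo.ndjson.gz"
with gzip.open(demo, "wt", encoding="utf-8") as fh:
    fh.write(json.dumps({"sessionId": 1, "lapNr": 1, "lapTime": "1:02.345"}) + "\n")
    fh.write(json.dumps({"sessionId": 1, "lapNr": 2, "lapTime": "1:01.987"}) + "\n")

records = list(read_ndjson_gz(demo))
print(len(records))          # 2
print(records[0]["lapNr"])   # 1
```

Streaming line by line keeps memory flat regardless of dump size, which is why the exporter emits NDJSON rather than one large JSON array.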
## Process exported data

Extract laps to CSV:

```bash
python examples/processing/extract_laps_to_csv.py --input output/full_dump/30476 --out output/full_dump/30476/laps_flat.csv
```

Extract sessions to CSV:

```bash
python examples/processing/extract_sessions_to_csv.py --input output/full_dump/30476 --out output/full_dump/30476/sessions_flat.csv
```

Extract announcements to CSV:

```bash
python examples/processing/extract_announcements_to_csv.py --input output/full_dump/30476 --out output/full_dump/30476/announcements_flat.csv
```
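Each extractor flattens nested NDJSON records into one CSV row per item. A sketch of that general pattern with `csv.DictWriter`; the nested shape and column names here are hypothetical, and the real extractors define their own column sets:

```python
import csv
import io

# Hypothetical nested lap records, for illustration only.
laps = [
    {"session": {"id": 101}, "lap": {"nr": 1, "time": "1:02.345"}},
    {"session": {"id": 101}, "lap": {"nr": 2, "time": "1:01.987"}},
]

def flatten(rec):
    """Pull selected nested fields up into a flat CSV row."""
    return {
        "session_id": rec["session"]["id"],
        "lap_nr": rec["lap"]["nr"],
        "lap_time": rec["lap"]["time"],
    }

buf = io.StringIO()  # a real extractor would open the --out file instead
writer = csv.DictWriter(buf, fieldnames=["session_id", "lap_nr", "lap_time"])
writer.writeheader()
writer.writerows(flatten(r) for r in laps)
print(buf.getvalue().splitlines()[0])  # session_id,lap_nr,lap_time
```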
Import laps to SQLite:

```bash
python examples/processing/ndjson_to_sqlite.py --input output/full_dump/30476/laps.ndjson.gz --out output/full_dump/30476/dump.db
sqlite3 output/full_dump/30476/dump.db "SELECT COUNT(*) FROM laps;"
```
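The same query works from Python via the stdlib `sqlite3` module. This sketch uses an in-memory stand-in for `dump.db` because the real `laps` schema is whatever `ndjson_to_sqlite.py` creates; the columns below are illustrative:

```python
import sqlite3

# In-memory stand-in for output/full_dump/30476/dump.db.
# Column names are made up; only the query pattern is the point.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE laps (session_id INTEGER, lap_nr INTEGER, lap_time TEXT)")
con.executemany(
    "INSERT INTO laps VALUES (?, ?, ?)",
    [(101, 1, "1:02.345"), (101, 2, "1:01.987")],
)
(count,) = con.execute("SELECT COUNT(*) FROM laps").fetchone()
print(count)  # 2
```

Against the real file, replace `":memory:"` with the path to `dump.db`.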
Interactive processor CLI (scans `output/full_dump/` and runs the processing steps):

```bash
python examples/processing/processor_cli.py
# or non-interactive for a specific org
python examples/processing/processor_cli.py --org 30476 --run-all
```
## Notes & tips

- Run examples from the repository root so the local `mylaps_client` package is on `sys.path`.
- Use `--token` on the example CLIs when endpoints require authentication.
- The exporter supports `--max-events`, `--max-sessions-per-event`, and `--dry-run` for low-memory testing.
- Long runs write a checkpoint file (`outdir/.checkpoint.json`) so you can resume after an interruption.
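The checkpoint idea can be sketched as follows. The file layout here (`done_events` list) is hypothetical; the real exporter defines its own checkpoint format inside `.checkpoint.json`:

```python
import json
import os
import tempfile

outdir = tempfile.mkdtemp()  # stand-in for the exporter's --output directory
ckpt_path = os.path.join(outdir, ".checkpoint.json")

def load_checkpoint():
    """Return saved progress, or a fresh state on the first run."""
    if os.path.exists(ckpt_path):
        with open(ckpt_path, encoding="utf-8") as fh:
            return json.load(fh)
    return {"done_events": []}

def save_checkpoint(state):
    # Write to a temp file and rename so an interrupt never leaves
    # a half-written checkpoint behind.
    tmp = ckpt_path + ".tmp"
    with open(tmp, "w", encoding="utf-8") as fh:
        json.dump(state, fh)
    os.replace(tmp, ckpt_path)

state = load_checkpoint()
for event_id in [1, 2, 3]:          # stand-in for the org's event list
    if event_id in state["done_events"]:
        continue                     # already exported on a previous run
    # ... export the event here ...
    state["done_events"].append(event_id)
    save_checkpoint(state)

print(load_checkpoint()["done_events"])  # [1, 2, 3]
```

Saving after each event means a rerun skips everything already exported and picks up at the first unfinished event.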
## Regenerating the client

If the API's OpenAPI spec changes, regenerate the client and place it under `mylaps_client/`. Example using `openapi-python-client`:

```bash
python -m openapi_python_client generate --url https://api2.mylaps.com/v3/api-docs --output-path ./mylaps_client
```
## Testing and CI

There are minimal tests under `tests/`, including tests for the processing extractors. Add CI and recorded fixtures if you want reproducible runs in CI.
## Contributing and next steps

Possible improvements; contributions welcome:

- Add retries/backoff to the exporter, plus recorded fixtures for CI.
- Add extra extractors (results/classifications) or tune the CSV columns.
- Add a GitHub Actions workflow to build and publish to PyPI on tag.
- Add a `--concurrency` CLI flag to control parallelism.
- Add an `--aggregate` flag to emit a single combined file for all events instead of per-event files.
- Add a unit test that verifies exporter output against recorded fixtures (recommended for CI).
## Project details
### speedhive_tools-0.1.1.tar.gz

- Size: 137.1 kB
- Tags: Source
- Uploaded via: twine/6.2.0 CPython/3.12.9
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `eadea3ac689dee6de30ed4f090b67c2aab973d0bbca10584c23565e01926e86b` |
| MD5 | `66c4df099c203b69e699e25749659ed9` |
| BLAKE2b-256 | `2997f9bf535f7db462ac14df2e3be8dab46642aa9279cdcde0e0cb063de66675` |
### speedhive_tools-0.1.1-py3-none-any.whl

- Size: 62.2 kB
- Tags: Python 3
- Uploaded via: twine/6.2.0 CPython/3.12.9
- Uploaded using Trusted Publishing: No

| Algorithm | Hash digest |
|---|---|
| SHA256 | `75f5f2cd1d4a752fe372d02fe75aa15720b7638d2b7a3b0e933648b91ef567ad` |
| MD5 | `8adc52a8afb5be3adaa4e6b8986d111b` |
| BLAKE2b-256 | `af0fda0226662aa17f8dc7402c40ec4692c62ea74a2beba10471ca3536892b91` |