
A Pydantic-based wrapper for the TfL Unified API https://api.tfl.gov.uk/. Not associated with or endorsed by TfL.


pydantic tfl api

I originally used TfL-python-api by @dhilmathy, but that version depends on the msrest package, which has been deprecated for over two years. I created this package to replace it, using pydantic and requests.

This package returns data from the TfL API in a more Pythonic way, using Pydantic models. It is a thin wrapper around the TfL API, so you can use the TfL API documentation to see what data is available.

Installation

pip install pydantic-tfl-api

or

poetry add pydantic-tfl-api

Usage

The models are Pydantic, so you can use the model_dump_json() method to fully expand all the objects in a result. See the Pydantic documentation for more help.
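model_dump_json() is standard Pydantic behaviour, not specific to this package. A minimal illustration with a toy model (the Station class is invented for this example and is not part of the library):

```python
from pydantic import BaseModel


class Station(BaseModel):
    # Hypothetical toy model for illustration only
    name: str
    zone: int


# Serialises the model, nested objects included, to a JSON string
print(Station(name="Brixton", zone=2).model_dump_json())
# → {"name":"Brixton","zone":2}
```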

You can obtain an API key from your profile page on the API portal, although you only need one if you are making more than a dozen or so requests per minute.

from pydantic_tfl_api.client import Client

token = "your_secret_token from https://api-portal.tfl.gov.uk/profile"

client = Client(token)
print(client.get_line_meta_modes())
print(client.get_lines(mode="bus")[0].model_dump_json())
print(client.get_lines(line_id="victoria")[0].model_dump_json())
print(client.get_route_by_line_id_with_direction(line_id="northern", direction="all").model_dump_json())

Class structure

The Pydantic classes are in the tfl.models module. The tfl.client module contains the Client class, which is the main class you will use to interact with the API.

Pydantic models are used to represent the data returned by the TfL API. There is a circular reference in the TfL API, which I handle in the StopPoint model by loading the Line model only after StopPoint is fully loaded.
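The library's exact mechanism aside, the usual Pydantic v2 pattern for breaking such a cycle is a forward reference resolved after both classes exist. A minimal sketch (class names match the library, but the fields are invented for illustration):

```python
from __future__ import annotations

from pydantic import BaseModel


class StopPoint(BaseModel):
    # References Line before Line is defined — a forward reference
    name: str
    lines: list[Line] = []


class Line(BaseModel):
    # References StopPoint back, completing the cycle
    id: str
    stop_points: list[StopPoint] = []


# Resolve StopPoint's deferred reference now that Line exists
StopPoint.model_rebuild()

stop = StopPoint(name="Oxford Circus", lines=[Line(id="victoria")])
```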

The following objects represent responses from the TfL API, and are therefore returned by the Client class methods - either individually or as an array of objects:

  • StopPoint
  • Mode
  • Line
  • RouteSequence
  • Disruption
  • StopPointsResponse
  • Prediction

These objects contain two properties, content_expires and shared_expires, which are expiry times calculated from the HTTP response timestamp and the max-age/s-maxage Cache-Control directives respectively. You can use them to determine an object's time to live and whether it is still valid, for example when implementing caching.
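As a sketch of how such an expiry can be derived from response headers using only the standard library (the function name and headers here are invented for illustration, not taken from the package):

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime


def calculate_expiry(date_header: str, cache_control: str) -> datetime:
    """Derive an expiry timestamp from a response's Date and Cache-Control
    headers, in the spirit of content_expires described above."""
    response_time = parsedate_to_datetime(date_header)
    max_age = 0
    for directive in cache_control.split(","):
        directive = directive.strip()
        if directive.startswith("max-age="):
            max_age = int(directive.split("=", 1)[1])
    return response_time + timedelta(seconds=max_age)


expiry = calculate_expiry("Wed, 21 Oct 2015 07:28:00 GMT", "public, max-age=60")
# An object is still valid while the current time is before its expiry
is_valid = datetime.now(timezone.utc) < expiry
```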

Here's a Mermaid visualisation of the Pydantic models (or view online):

classDiagram
    direction LR
    class AffectedRoute
    class RouteSectionNaptanEntrySequence
    AffectedRoute ..> RouteSectionNaptanEntrySequence

    class PredictionTiming
    class Prediction
    Prediction ..> PredictionTiming 

    class Crowding
    class PassengerFlow
    class TrainLoading
    Crowding ..> PassengerFlow
    Crowding ..> TrainLoading

    class Identifier
    Identifier ..> Crowding

    class OrderedRoute
    class RouteSequence
    class MatchedStop
    RouteSequence ..> MatchedStop
    RouteSequence ..> OrderedRoute

    class ApiError
    class Mode


    class Line
    Line ..> Disruption
    Line ..> LineStatus
    Line ..> RouteSection
    Line ..> ServiceType
    Line ..> Crowding

    class RouteSection
    class ServiceType
    class ValidityPeriod
    class LineStatus
    class Disruption

    class StopPoint
    class StopPointsResponse
    class AdditionalProperties
    class LineModeGroup
    class LineGroup



    LineStatus ..> ValidityPeriod
    LineStatus ..> Disruption

    Disruption ..> AffectedRoute
    MatchedStop ..> Identifier 

    RouteSectionNaptanEntrySequence ..> StopPoint

    StopPoint ..> LineGroup
    StopPoint ..> LineModeGroup
    StopPoint ..> AdditionalProperties
    StopPoint ..> Line
    StopPointsResponse ..> StopPoint

Development environment

The devcontainer is set up to use the Poetry package manager, so you can use the usual poetry commands to manage the environment. The poetry.lock file is checked in, so you can run poetry install --with dev --no-interaction --sync --no-root to install the dependencies (the devcontainer does this in its postCreateCommand).
