
A Pydantic-based wrapper for the TfL Unified API https://api.tfl.gov.uk/. Not associated with or endorsed by TfL.


pydantic tfl api

I originally used TfL-python-api by @dhilmathy, but that version depends on the msrest package, which has been deprecated for more than two years. I created this package to replace it, using pydantic and requests.

This package returns data from the TfL API in a more Pythonic way, using Pydantic models. It's a thin wrapper around the TfL API, so you can use the TfL API documentation to see what data is available.

Installation

pip install pydantic-tfl-api

or

poetry add pydantic-tfl-api

Usage

The package uses Pydantic, so you can use the model_dump_json() method to fully expand all the objects in a result. See the Pydantic documentation for more help.

You can obtain an API key from your profile page on the API portal, although you only need one if you're making more than a dozen or so requests per minute.

from pydantic_tfl_api import LineClient

token = None # only need a token if > 1 request per second

client = LineClient(token)
response_object = client.MetaModes()
# the response object is a pydantic model
# the `content` attribute is the API response, parsed into a pydantic model
mode_array = response_object.content
# if it's an array, it's wrapped in a `RootModel`, which means it has a root attribute containing the array
array_content = mode_array.root

print(array_content[0].modeName)

# obviously, you can chain these together
print(client.MetaModes().content.root[0].model_dump_json())
print(client.GetByModeByPathModes(modes="bus").content.root[0].model_dump_json())

# you can also use the models directly
print([f'The {line_item.name} line is {line_item.modeName}' for line_item in client.StatusByModeByPathModesQueryDetailQuerySeverityLevel(modes="tube").content.root])

# some return enormous amounts of data with very complex models
print(client.RouteSequenceByPathIdPathDirectionQueryServiceTypesQueryExcludeCrowding(id="northern", direction="all").model_dump_json())

Class structure

Models

Pydantic models represent the data returned by the TfL API and live in the models module. The TfL API contains circular references, which are handled using ForwardRef in the models. Overall, there are 117 Pydantic models in the package. Models are automatically generated from the TfL API OpenAPI documentation, so if the TfL API changes, the models will be updated to reflect this. Fields are sanitized to avoid reserved Python words (class or from, for example) but otherwise match the TfL API. In some cases the TfL response has no definition, so the model is a Dict[str, Any].
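
For illustration, here is a minimal sketch of how a sanitized field could look; the class, field names, and alias handling below are assumptions for demonstration only, not the package's actual generated code:

from typing import Optional

from pydantic import BaseModel, ConfigDict, Field

class ExampleModel(BaseModel):  # hypothetical model, for illustration only
    model_config = ConfigDict(populate_by_name=True)
    # `from` is a reserved word in Python, so one possible sanitization is a
    # renamed attribute with an alias back to the original TfL field name
    from_: Optional[str] = Field(default=None, alias="from")
    destinationName: Optional[str] = None

# validation accepts the aliased TfL field name
example = ExampleModel.model_validate({"from": "Bank", "destinationName": "Morden"})
print(example.from_)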

Some of the TfL responses are arrays, and these are wrapped in a RootModel object, which contains the array in the root attribute - for example, a LineArray model contains an array of Line objects in the root attribute. See the Pydantic documentation for more information on how to use RootModels.
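
As a simplified sketch of that pattern (the real Line model has many more fields than shown here):

from pydantic import BaseModel, RootModel

class Line(BaseModel):  # simplified stand-in for the package's Line model
    name: str
    modeName: str

class LineArray(RootModel[list[Line]]):
    pass

lines = LineArray.model_validate([{"name": "Northern", "modeName": "tube"}])
for line in lines.root:  # the array itself lives in the `root` attribute
    print(f"{line.name} ({line.modeName})")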

Successful responses are wrapped in a ResponseModel object, which contains the response itself (in the content attribute) along with two calculated cache expiry times, content_expires and shared_expires, derived from the HTTP response timestamp and the max-age/s-maxage headers respectively. You can use these to work out an object's time to live and whether it is still valid, for example when implementing caching. Failures return an ApiError object, which contains the HTTP status code and the error message.
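
Here is a sketch of how this could be used for simple caching. It assumes content_expires is a timezone-aware datetime (and may be None if no cache headers were returned), and it checks for the content attribute rather than importing ApiError, since the import path isn't shown here:

from datetime import datetime, timezone

from pydantic_tfl_api import LineClient

client = LineClient(None)
result = client.MetaModes()

if hasattr(result, "content"):
    # ResponseModel: derive a time-to-live from the calculated expiry
    if result.content_expires is not None:
        ttl = (result.content_expires - datetime.now(timezone.utc)).total_seconds()
        print(f"cache for {ttl:.0f} seconds")
    print(result.content)
else:
    # ApiError: carries the HTTP status code and error message
    print("request failed:", result)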

Clients

There are dedicated clients for each of the TfL APIs, all inheriting from core.Client. Method names match the path IDs in the TfL API documentation, unless the path ID is a reserved word in Python, in which case it is suffixed with Query_ (there are currently none in the package).

Clients are automatically generated from the TfL API OpenAPI documentation, so if the TfL API changes, the clients will be updated to reflect this. There is a client for every TfL API endpoint, named after the endpoint with a Client suffix, and its methods take the same parameters as the TfL API documentation. Here are the current clients from the endpoints module:

endpoints
├── AccidentStatsClient.py
├── AirQualityClient.py
├── BikePointClient.py
├── CrowdingClient.py
├── JourneyClient.py
├── LiftDisruptionsClient.py
├── LineClient.py
├── ModeClient.py
├── OccupancyClient.py
├── PlaceClient.py
├── RoadClient.py
├── SearchClient.py
├── StopPointClient.py
└── VehicleClient.py

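All clients are constructed the same way, with an optional API key. The sketch below assumes the other clients are importable from the package root in the same way as LineClient in the earlier example (otherwise import them from the endpoints module); the only method call shown is the MetaModes example from above:

from pydantic_tfl_api import LineClient, StopPointClient

token = None  # or your TfL API key
line_client = LineClient(token)
stop_client = StopPointClient(token)  # same constructor signature for every client

# method names mirror the TfL path IDs, e.g. MetaModes as shown earlier
print(line_client.MetaModes().content.root[0].modeName)
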
A Mermaid visualisation of the Pydantic models is available to view online.

Development environment

The devcontainer is set up to use the poetry package manager, and you can use the usual poetry commands to manage the environment. The poetry.lock file is checked in, so you can use poetry install --with dev --no-interaction --sync --no-root to install the dependencies (which the devcontainer does in its postCreateCommand).

You can test the build by running ./build.sh "/workspaces/pydantic_tfl_api/pydantic_tfl_api" "/workspaces/pydantic_tfl_api/TfL_OpenAPI_specs" True in the devcontainer. This will build the package and install it in the devcontainer. You can then run the tests with pytest in the tests directory.
