A Pydantic-based wrapper for the TfL Unified API https://api-portal.tfl.gov.uk/. Not associated with or endorsed by TfL.

pydantic tfl api

I originally used TfL-python-api by @dhilmathy, but that version depends on the msrest package, which has been deprecated for more than two years. I created this package to replace it, using pydantic and requests.

This package returns data from the TfL API in a more Pythonic way, using Pydantic models. It's a thin wrapper around the TfL API, so you can use the TfL API documentation to see what data is available.

Installation

pip install pydantic-tfl-api

or

poetry add pydantic-tfl-api

Usage

The package uses Pydantic, so you can use the model_dump_json() method to fully expand all the objects in a result. See the Pydantic documentation for more help.

You can obtain an API key from your profile page on the API portal, although you only need one if you're making more than a dozen or so requests per minute.

from pydantic_tfl_api import LineClient

token = None  # a token is only needed for more than a dozen or so requests per minute

client = LineClient(token)
response_object = client.MetaModes()
# the response object is a pydantic model
# the `content` attribute is the API response, parsed into a pydantic model
mode_array = response_object.content
# if it's an array, it's wrapped in a `RootModel`, which means it has a root attribute containing the array
array_content = mode_array.root

print(array_content[0].modeName)

# obviously, you can chain these together
print(client.MetaModes().content.root[0].model_dump_json())
print(client.GetByModeByPathModes(modes="bus").content.root[0].model_dump_json())

# you can also use the models directly
print([f'The {line_item.name} line is {line_item.modeName}' for line_item in client.StatusByModeByPathModesQueryDetailQuerySeverityLevel(modes="tube").content.root])

# some endpoints return enormous amounts of data with very complex models
print(client.RouteSequenceByPathIdPathDirectionQueryServiceTypesQueryExcludeCrowding(id="northern", direction="all").model_dump_json())

Class structure

Models

Pydantic models are used to represent the data returned by the TfL API, and live in the models module. The TfL API contains circular references, which are handled using ForwardRef in the models. Overall, there are 117 Pydantic models in the package. Models are automatically generated from the TfL API OpenAPI documentation, so if the TfL API changes, the models will be updated to reflect this. Field names are sanitized to remove Python reserved words (class or from, for example), but otherwise are identical to the TfL API. In some cases, the TfL response has no definition, so the model is a Dict[str, Any].
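As an illustration of the general technique (a standalone sketch with a hypothetical model, not the package's actual generated code), Pydantic can map a sanitized field name back to a reserved-word JSON key with an alias:

```python
from pydantic import BaseModel, Field


class Casualty(BaseModel):
    # `class` is reserved in Python, so the sanitized field name
    # maps back to the original JSON key via an alias
    class_: str = Field(alias="class")
    age: int


c = Casualty.model_validate({"class": "Pedestrian", "age": 32})
print(c.class_)  # -> Pedestrian
```

Dumping with by_alias=True round-trips back to the original TfL key names.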

Some of the TfL responses are arrays, and these are wrapped in a RootModel object, which contains the array in its root attribute. For example, a LineArray model contains an array of Line objects in its root attribute. See the Pydantic documentation for more information on how to use root models.
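To illustrate (a standalone sketch using plain Pydantic, with hypothetical Mode and ModeArray models rather than the package's generated ones), a RootModel wrapping a list behaves like this:

```python
from typing import List

from pydantic import BaseModel, RootModel


class Mode(BaseModel):
    modeName: str


class ModeArray(RootModel[List[Mode]]):
    # the wrapped list lives in the `root` attribute
    pass


modes = ModeArray(root=[Mode(modeName="tube"), Mode(modeName="bus")])
print(modes.root[0].modeName)       # -> tube
print(modes.model_dump_json())      # -> [{"modeName":"tube"},{"modeName":"bus"}]
```

This is why the usage example above reads `response_object.content.root` to get at the underlying list.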

Successful responses are wrapped in a ResponseModel object, which contains the parsed response (in the content attribute) along with two cache expiry times: content_expires and shared_expires, calculated from the HTTP response timestamp and the maxage and s-maxage headers respectively. You can use these to work out an object's time to live and whether it is still valid, for example when implementing caching. Failures return an ApiError object, which contains the HTTP status code and the error message.
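A sketch of how the expiry fields might drive a simple cache check (CachedResponse here is a hypothetical stand-in for the package's ResponseModel, reusing only the content_expires and content attribute names described above):

```python
from datetime import datetime, timedelta, timezone

from pydantic import BaseModel


class CachedResponse(BaseModel):
    """Stand-in for the package's ResponseModel, just for this sketch."""
    content_expires: datetime
    content: dict


def is_fresh(response: CachedResponse) -> bool:
    # the cached object is valid while the expiry time is still in the future
    return datetime.now(timezone.utc) < response.content_expires


resp = CachedResponse(
    content_expires=datetime.now(timezone.utc) + timedelta(seconds=30),
    content={"modeName": "tube"},
)
print(is_fresh(resp))  # -> True
```

A real cache would re-fetch from the API whenever is_fresh returns False.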

Clients

There are dedicated clients for each of the TfL APIs. These all inherit from core.Client. The method names are the same as the path IDs in the TfL API documentation, unless they are reserved words in Python, in which case they are suffixed with Query_ (there are currently none in the package).

Clients are automatically generated from the TfL API OpenAPI documentation, so if the TfL API changes, the clients will be updated to reflect this. There is a client for every TfL API endpoint, named after the endpoint with a Client suffix, and methods take the same parameters as in the TfL API documentation. Here are the current clients from the endpoints module:

endpoints
├── AccidentStatsClient.py
├── AirQualityClient.py
├── BikePointClient.py
├── CrowdingClient.py
├── JourneyClient.py
├── LiftDisruptionsClient.py
├── LineClient.py
├── ModeClient.py
├── OccupancyClient.py
├── PlaceClient.py
├── RoadClient.py
├── SearchClient.py
├── StopPointClient.py
└── VehicleClient.py

There is a Mermaid visualisation of the Pydantic models, which you can view online.

Development environment

The devcontainer is set up to use the poetry package manager, and you can use the usual poetry commands to manage the environment. The poetry.lock file is checked in, so you can install the dependencies with poetry install --with dev --no-interaction --sync --no-root (which the devcontainer runs as its postCreateCommand).

You can test the build by running ./build.sh "/workspaces/pydantic_tfl_api/pydantic_tfl_api" "/workspaces/pydantic_tfl_api/TfL_OpenAPI_specs" True in the devcontainer. This builds the package and installs it in the devcontainer. You can then run the tests with pytest in the tests directory.
