
A Pydantic-based wrapper for the TfL Unified API https://api.tfl.gov.uk/. Not associated with or endorsed by TfL.


pydantic tfl api

I originally used TfL-python-api by @dhilmathy, but that version depends on the msrest package, which has been deprecated for 2+ years. I created this package to replace it, using pydantic and requests.

This package returns data from the TfL API in a more Pythonic way, using pydantic models. It's a thin wrapper around the TfL API, so you can use the TfL API documentation to see what data is available.

Installation

pip install pydantic-tfl-api

or

poetry add pydantic-tfl-api

Usage

This package uses Pydantic, so you can use the model_dump_json() method to fully expand all the objects in the result. See the Pydantic documentation for more help.

You can obtain an API key from your profile page on the TfL API portal, although you only need one if you're making more than a dozen or so requests per minute.

from pydantic_tfl_api import LineClient

token = None # only needed if making more than a dozen or so requests per minute

client = LineClient(token)
response_object = client.MetaModes()
# the response object is a pydantic model
# the `content` attribute is the API response, parsed into a pydantic model
mode_array = response_object.content
# if it's an array, it's wrapped in a `RootModel`, which means it has a root attribute containing the array
array_content = mode_array.root

print(array_content[0].modeName)

# obviously, you can chain these together
print(client.MetaModes().content.root[0].model_dump_json())
print(client.GetByModeByPathModes(modes="bus").content.root[0].model_dump_json())

# you can also use the models directly
print([f'The {line_item.name} line is {line_item.modeName}' for line_item in client.StatusByModeByPathModesQueryDetailQuerySeverityLevel(modes="tube").content.root])

# some return enormous amounts of data with very complex models
print(client.RouteSequenceByPathIdPathDirectionQueryServiceTypesQueryExcludeCrowding(id="northern", direction="all").model_dump_json())

Class structure

Models

Pydantic models are used to represent the data returned by the TfL API, and are in the models module. There are circular references in the TfL API, so these are handled by using ForwardRef in the models. Overall, there are 117 Pydantic models in the package. Models are automatically generated from the TfL API OpenAPI documentation, so if the TfL API changes, the models will be updated to reflect this. Fields are sanitized to remove any reserved words in Python (class or from, for example), but otherwise are identical to the TfL API. In some cases, the TfL response has no definition, so the model is a Dict[str, Any].
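As an illustration of the field sanitisation, a reserved-word field such as from would be exposed with a trailing underscore and mapped back to the original name via an alias. The model and field below are hypothetical examples, not taken from the package:

```python
from pydantic import BaseModel, Field

class Crowding(BaseModel):
    # "from" is a Python reserved word, so the field is exposed as
    # "from_" and mapped back to the original API name via an alias
    from_: str = Field(alias="from")

station = Crowding.model_validate({"from": "Bank"})
print(station.from_)                      # access via the sanitised name
print(station.model_dump(by_alias=True))  # serialises back to the original name
```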

Some of the TfL responses are arrays, and these are wrapped in a RootModel object, which contains the array in the root attribute - for example, a LineArray model contains an array of Line objects in the root attribute. See the Pydantic documentation for more information on how to use RootModels.
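A minimal sketch of the RootModel pattern, using a generic list of strings rather than the package's actual Line model:

```python
from pydantic import RootModel

class ModeArray(RootModel[list[str]]):
    """Wraps a bare JSON array, as the *Array models in this package do."""
    pass

modes = ModeArray(["tube", "bus", "dlr"])
print(modes.root[0])  # the array lives in the .root attribute
print(modes.model_dump_json())
```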

Successful responses are wrapped in a ResponseModel object, which contains the cache expiry time (content_expires and shared_expires, which are the calculated expiry based on the HTTP response timestamp and the maxage/s-maxage header respectively) for use to calculate the time to live of the object, and to determine if the object is still valid - for example if implementing caching - and the response object (in the content attribute). Failures return an ApiError object, which contains the HTTP status code and the error message.
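For example, a cache layer might compare content_expires against the current time to decide whether a stored response is still fresh. This sketch uses a stand-in dataclass rather than the package's actual ResponseModel:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class CachedResponse:
    # stand-in for the package's ResponseModel, keeping only the
    # two attributes this sketch needs
    content: object
    content_expires: datetime

def is_fresh(response: CachedResponse, now: Optional[datetime] = None) -> bool:
    """Return True if the cached response is still within its TTL."""
    now = now or datetime.now(timezone.utc)
    return now < response.content_expires

cached = CachedResponse(
    content={"modeName": "tube"},
    content_expires=datetime.now(timezone.utc) + timedelta(seconds=30),
)
print(is_fresh(cached))  # True while within the 30-second window
```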

Clients

There are dedicated clients for each of the TfL APIs. These all inherit from core.Client. The method names are the same as the path IDs in the TfL API documentation, unless they are reserved words in Python, in which case they are suffixed with Query_ (there are currently none in the package).

Clients are automatically generated from the TfL API OpenAPI documentation, so if the TfL API changes, the clients will be updated to reflect this. Clients are available for all the TfL API endpoints, and are named after the endpoint with a Client suffix. Methods follow the naming convention described above and take the same parameters as in the TfL API documentation. Here are the current clients from the endpoints module:

endpoints
├── AccidentStatsClient.py
├── AirQualityClient.py
├── BikePointClient.py
├── CrowdingClient.py
├── JourneyClient.py
├── LiftDisruptionsClient.py
├── LineClient.py
├── ModeClient.py
├── OccupancyClient.py
├── PlaceClient.py
├── RoadClient.py
├── SearchClient.py
├── StopPointClient.py
├── VehicleClient.py

A Mermaid visualisation of the Pydantic models is available to view online.

Development environment

The devcontainer is set up to use the poetry package manager. You can use the poetry commands to manage the environment. The poetry.lock file is checked in, so you can use poetry install --with dev --no-interaction --sync --no-root to install the dependencies (which the devcontainer runs as its postCreateCommand).
