
TangoGQL implementation using Ariadne

Project description

TangoGQL

Introduction

This is a GraphQL schema and implementation for accessing a Tango control system. Its motivating use case is as a backend for the Taranta project, but it can also be used as a generic GraphQL API for Tango.

GraphQL is a standard; there are many resources available for learning about it.

This project is based on Ariadne (https://ariadnegraphql.org).

Note: this repo replaces the older implementation at https://gitlab.com/tango-controls/web/tangogql. Versions above 2.0 are based on this repo.

TODO

  • Some parts of the schema are not implemented (mainly, the "domain/family/member")
  • Think about better ways of structuring the code.

Usage

Install (preferably in a virtualenv) from PyPI with:

pip install tangogql

To run (assuming you have TANGO_HOST set up to point to an accessible control system):

uvicorn tangogql.dev:app

Then visit http://localhost:8000 for an interactive GraphiQL "playground". This is probably the best way to get acquainted with the schema.

For a "Taranta compatible" setup:

uvicorn tangogql.main:app --port <port>

To use with Taranta, start the Taranta dev server locally, access /testdb/, and look in the terminal for an error message like [HPM] Error occurred while trying to proxy request /testdb/db from localhost:3001 to http://localhost:22484. The port number shown there (22484 in this example) is the one to use for tangogql above.

It is also possible to use TangoGQL as a route in another web application. See Ariadne's documentation.

Configuration

The server can be configured either through environment variables or a config.json file. See settings.py for the available settings. For a production setup, you should at least set TANGOGQL_SECRET to something random; it is used for encrypting client auth data.

For local testing, you can set TANGOGQL_NO_AUTH=true to disable authentication entirely.
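The precedence between the two sources can be pictured roughly like this (a simplified sketch, not the actual settings.py logic; only TANGOGQL_SECRET and TANGOGQL_NO_AUTH are taken from the text above, and the parsing details are illustrative):

```python
import json
import os

# Simplified sketch of settings resolution: defaults first, then config.json,
# then environment variables on top. The real logic lives in settings.py.
DEFAULTS = {"TANGOGQL_SECRET": "", "TANGOGQL_NO_AUTH": False}

def load_settings(config_path: str = "config.json") -> dict:
    settings = dict(DEFAULTS)
    if os.path.exists(config_path):
        with open(config_path) as f:
            settings.update(json.load(f))
    for key, default in DEFAULTS.items():
        if key in os.environ:
            raw = os.environ[key]
            # Interpret booleans the way TANGOGQL_NO_AUTH=true suggests.
            settings[key] = (
                raw.lower() in ("1", "true", "yes") if isinstance(default, bool) else raw
            )
    return settings
```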

You can override logging config by pointing logging_config to a different file than the default logging.ini.
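As a sketch of what such a file can look like, here is a minimal config in the standard library's logging fileConfig format (the handler and formatter names are illustrative; the project's own logging.ini may differ):

```ini
[loggers]
keys=root

[handlers]
keys=console

[formatters]
keys=plain

[logger_root]
level=DEBUG
handlers=console

[handler_console]
class=StreamHandler
level=DEBUG
formatter=plain
args=(sys.stderr,)

[formatter_plain]
format=%(asctime)s %(name)s %(levelname)s %(message)s
```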

Testing

pip install -e .[tests]
pytest

Development tips

  • The repo contains a pre-commit configuration that is also enforced by the CI jobs; it's recommended to use it locally as well. The point is to keep committed code clean and consistently formatted. Commits may fail either because linting caught an issue, or because code reformatting resulted in changes.

      pip install pre-commit
      pre-commit install
    
  • Running uvicorn with --log-level=debug provides more feedback on what's happening.

  • It may also be useful to set the environment variable PYTHONASYNCIODEBUG=1 which enables some asyncio debugging tools. This can help track down problems with tasks/coroutines.

  • The repo contains a requirements.txt file to pin versions when building the docker image. That file was generated by running uv pip compile pyproject.toml -o requirements.txt. You could just use pip freeze, but uv pip compile gives nicer output that shows where each dependency comes from. You can update a single dependency with --upgrade-package. I'd recommend just deleting and re-creating the file from time to time to update all dependencies. See the uv pip compile documentation.

Navigating the source code

While the codebase isn't large, the structure may be a little hard to follow. This is because Ariadne works by starting with the tangogql.graphql schema file, and then pulling the code together based on that structure.

Firstly, you may want to get familiar with the Ariadne docs and the GraphQL standard.

Now, if you want to know how e.g. query { device(name:"...") { alias } } is retrieved, check the definition in the schema file:

...
type Query {
    ...
    device(name: String!): Device
    ...
}
...

What we are looking for is the top-level Query object type and its field "device" (which has the Device object type).

The code base is organized along the top-level fields: "query", "subscription", and "mutation". So, look in the query/query.py file, and find the "resolver" for the "device" field:

@query.field("device")
def resolve_device(_, info, name: str) -> DbDeviceInfo:
    return info.context["device_loader"].load(name)

This function is run to get the data for each device query. It uses a "dataloader", which is basically a way to fetch data more efficiently by batching calls across the whole query. You can look in loaders.py for more info, but the important thing here is that it returns a DbDeviceInfo object for our device: a simple database information class.
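The batching idea behind the dataloader can be illustrated with a stripped-down, pure-asyncio sketch (this is not the actual loaders.py code; the class and function names here are made up):

```python
import asyncio

class DeviceLoader:
    """Toy dataloader: .load() calls made during the same event-loop tick are
    collected and answered with a single batched lookup."""

    def __init__(self, batch_fetch):
        self._batch_fetch = batch_fetch  # async: list of names -> list of results
        self._pending = {}               # name -> Future
        self._scheduled = False

    def load(self, name):
        loop = asyncio.get_running_loop()
        if name not in self._pending:
            self._pending[name] = loop.create_future()
        if not self._scheduled:
            self._scheduled = True
            loop.create_task(self._dispatch())  # runs once control returns to the loop
        return self._pending[name]

    async def _dispatch(self):
        pending, self._pending, self._scheduled = self._pending, {}, False
        for fut, result in zip(pending.values(), await self._batch_fetch(list(pending))):
            fut.set_result(result)

async def fetch_devices(names):
    # Stand-in for one batched Tango database call.
    return [f"info for {name}" for name in names]

async def demo():
    loader = DeviceLoader(fetch_devices)
    # Two loads, but fetch_devices is called only once, with both names.
    return await asyncio.gather(loader.load("sys/tg_test/1"), loader.load("sys/tg_test/2"))
```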

In this case, "alias" is directly available on this object, which means it will automatically get picked up by Ariadne and returned. Done!

Let's look at a slightly more complicated Device field, such as connected: search the code for ObjectType("Device") to find where it's defined, then look for the resolver function for the "connected" field, as above. You should find a function that takes our DbDeviceInfo object, does some work to check whether the device responds, and returns a boolean accordingly. Done.

With this knowledge, you should be able to navigate the code by looking for the relevant object types and resolvers: a resolver can return a new object, which in turn gets resolved, and so on. The chain can go many steps deep, but the pattern is the same.
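To make the pattern concrete, here is a toy model of that resolver chain in plain Python (this is not Ariadne's API; the device data, field types, and "connected" check are made up for illustration):

```python
# Each (type, field) pair may have a resolver; fields without one are read
# straight off the parent object, like "alias" on the device info above.
DEVICES = {"sys/tg_test/1": {"name": "sys/tg_test/1", "alias": "test-device"}}

RESOLVERS = {
    ("Query", "device"): lambda parent, args: DEVICES[args["name"]],
    ("Device", "connected"): lambda parent, args: parent["name"] in DEVICES,
}

FIELD_TYPES = {("Query", "device"): "Device"}  # which type a field resolves to

def resolve(type_name, parent, selection, args=None):
    result = {}
    for field, sub_selection in selection.items():
        resolver = RESOLVERS.get((type_name, field))
        value = resolver(parent, (args or {}).get(field, {})) if resolver else parent[field]
        if sub_selection:  # the resolver returned a new object: recurse into it
            value = resolve(FIELD_TYPES[(type_name, field)], value, sub_selection)
        result[field] = value
    return result

# query { device(name: "sys/tg_test/1") { alias connected } }
answer = resolve(
    "Query", None,
    {"device": {"alias": {}, "connected": {}}},
    args={"device": {"name": "sys/tg_test/1"}},
)
```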

Running Unit Tests in Docker for TangoGQL

To run unit tests in Docker for the TangoGQL project, follow these steps. These instructions assume you are using PowerShell:

Steps to Run Unit Tests in Docker

  1. Build the Docker Image
    Open PowerShell and build the Docker image from the project directory:

    docker build -t tangogql .
    
  2. Run the container

    docker run --rm -it tangogql bash
    
  3. Install Test Dependencies
    Install the test dependencies for TangoGQL (TangoGQL itself is already installed):

    uv pip install ".[tests]"
    
  4. Run Unit Tests
    Finally, run the test suite:

    pytest
    

These steps will set up and run the unit tests within the Docker container.

Download files

Download the file for your platform.

Source Distribution

tangogql-2.2.2.tar.gz (85.7 kB)

Uploaded Source

Built Distribution


tangogql-2.2.2-py3-none-any.whl (57.9 kB)

Uploaded Python 3

File details

Details for the file tangogql-2.2.2.tar.gz.

File metadata

  • Download URL: tangogql-2.2.2.tar.gz
  • Upload date:
  • Size: 85.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for tangogql-2.2.2.tar.gz

  • SHA256: 053c269aeed7167be201c65389c8939fef605cc52d7a11e03b06b26b6f4938e9
  • MD5: 61037a65ea12e1ba92a5867029b4e55f
  • BLAKE2b-256: 5f3210c231c15fccf054ebdba0ee4b37a2bce3e7fa320bf3ccbb4f32fed7721c


File details

Details for the file tangogql-2.2.2-py3-none-any.whl.

File metadata

  • Download URL: tangogql-2.2.2-py3-none-any.whl
  • Upload date:
  • Size: 57.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for tangogql-2.2.2-py3-none-any.whl

  • SHA256: 92e8b7e19f3c8d34853897255f27a0754ce70e4342eef639d19d6dd6bba6b34b
  • MD5: adec6dfc3d87135878fe1d0faf900d1e
  • BLAKE2b-256: b7dcd7167dea46edd23816af8ec21e1423ad8107e53a7fb81e3f76dc1aab2cdc

