
TangoGQL implementation using Ariadne


TangoGQL

Introduction

This is a GraphQL schema and implementation for accessing a Tango control system. Its motivating use case is serving as the backend for the Taranta project, but it can be used as a generic GraphQL API for Tango.

GraphQL is a standard; there are many resources available for learning it.

This project is based on Ariadne (https://ariadnegraphql.org).

Note: this repo replaces the older implementation at https://gitlab.com/tango-controls/web/tangogql. Versions above 2.0 are based on this repo.

TODO

  • Some parts of the schema are not implemented (mainly, the "domain/family/member")
  • Think about better ways of structuring the code.

Usage

Install (preferably in a virtualenv) from PyPI with:

pip install tangogql

To run (assuming you have TANGO_HOST set up to point to an accessible control system):

uvicorn tangogql.dev:app

Then visit http://localhost:8000 for an interactive GraphiQL "playground". This is probably the best way to get acquainted with the schema.
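The playground issues ordinary HTTP POST requests, so queries can also be sent programmatically. A minimal sketch using only the standard library (the device name and the endpoint path are assumptions; adjust them for your setup):

```python
import json
import urllib.request

# A GraphQL query asking for a device's alias (the device name is hypothetical).
query = """
query {
    device(name: "sys/tg_test/1") {
        alias
    }
}
"""

payload = json.dumps({"query": query}).encode()
request = urllib.request.Request(
    "http://localhost:8000/",  # endpoint path may differ in your setup
    data=payload,
    headers={"Content-Type": "application/json"},
)
# With a running server, uncomment the next line to send the query:
# response = json.load(urllib.request.urlopen(request))
```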

For a "Taranta compatible" setup:

uvicorn tangogql.main:app --port <port>

To use with Taranta, start the Taranta dev server locally, access /testdb/, and look in the terminal for an error message like [HPM] Error occurred while trying to proxy request /testdb/db from localhost:3001 to http://localhost:22484. Here, port number 22484 is the one to pass to tangogql above.

It is also possible to use TangoGQL as a route in another web application. See Ariadne's documentation.

Configuration

The server can be configured either using environment variables or a config.json file. See settings.py for the available settings. For production, you should at least set TANGOGQL_SECRET to something random, as it is used for encrypting client auth data.
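Any sufficiently random string works as a secret; for example, one could generate one with Python's secrets module (a sketch, not a requirement of the server):

```python
import secrets

# token_hex(32) draws 32 random bytes and renders them as 64 hex characters.
secret = secrets.token_hex(32)
print(f"TANGOGQL_SECRET={secret}")
```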

For local testing, you can set TANGOGQL_NO_AUTH=true to disable authentication entirely.

You can override the logging config by pointing logging_config to a file other than the default logging.ini.

Testing

pip install -e .[tests]
pytest

Development tips

  • The repo contains a pre-commit configuration that is also enforced by the CI jobs; it's recommended to use it locally as well. The point is to keep committed code clean and consistently formatted. Commits may fail either because linting caught an issue, or because code reformatting resulted in changes.

      pip install pre-commit
      pre-commit install
    
  • Running uvicorn with --log-level=debug provides more feedback on what's happening.

  • It may also be useful to set the environment variable PYTHONASYNCIODEBUG=1 which enables some asyncio debugging tools. This can help track down problems with tasks/coroutines.

  • The repo contains a requirements.txt file to pin versions when building the docker image. That file was generated by running uv pip compile pyproject.toml -o requirements.txt. You could just use pip freeze, but uv pip compile gives nicer output that shows where the dependencies come from. You can update a single dependency with --upgrade-package. I'd recommend just deleting and re-creating the file from time to time to update all dependencies. See the uv pip compile documentation.
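The asyncio debug mode mentioned above can also be switched on from code rather than via the environment, which is handy for quick experiments:

```python
import asyncio

async def main() -> bool:
    # In debug mode the event loop logs slow callbacks and never-awaited
    # coroutines; here we just confirm that the mode is enabled.
    return asyncio.get_running_loop().get_debug()

# Equivalent to setting PYTHONASYNCIODEBUG=1 in the environment.
debug_enabled = asyncio.run(main(), debug=True)
```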

Navigating the source code

While the codebase isn't large, the structure may be a little hard to follow. This is because Ariadne works by starting from the tangogql.graphql schema file and pulling the code together based on that structure.

First, you may want to get familiar with the Ariadne docs and the GraphQL standard.

Now, if you want to know how e.g. query { device(name:"...") { alias } } is retrieved, check the definition in the schema file:

...
type Query {
    ...
    device(name: String!): Device
    ...
}
...

What we are looking for is the toplevel Query object type, and the field "device" (which has Device object type).

The code base is organized along the top-level fields: "query", "subscription", and "mutation". So, look in the query/query.py file. Then find the "resolver" for the "device" field, i.e.

@query.field("device")
def resolve_device(_, info, name: str) -> DbDeviceInfo:
    return info.context["device_loader"].load(name)

This function runs for each device query. It uses a "dataloader", which batches lookups across the whole query so data can be fetched more efficiently. You can look in loaders.py for more info, but the important thing here is that it returns a DbDeviceInfo object for our device. This is a simple database information class.
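The dataloader pattern itself can be sketched in a few lines of plain Python (a toy model, not tangogql's actual loaders.py): loads requested during the same event-loop iteration are answered by a single batched call.

```python
import asyncio

class SimpleLoader:
    """Toy sketch of the dataloader idea: load() calls made within the same
    event-loop iteration are collected and served by one batch-function call."""

    def __init__(self, batch_fn):
        self._batch_fn = batch_fn
        self._pending = []

    def load(self, key):
        loop = asyncio.get_running_loop()
        future = loop.create_future()
        if not self._pending:
            # First key in this tick: schedule a single dispatch for the batch.
            loop.call_soon(self._dispatch)
        self._pending.append((key, future))
        return future

    def _dispatch(self):
        pending, self._pending = self._pending, []
        results = self._batch_fn([key for key, _ in pending])
        for (_, future), result in zip(pending, results):
            future.set_result(result)

batches = []

def fetch_many(keys):
    # Stand-in for one database round-trip answering several keys at once.
    batches.append(keys)
    return [key.upper() for key in keys]

async def demo():
    loader = SimpleLoader(fetch_many)
    # Two separate load() calls, but only one call to fetch_many below.
    return await asyncio.gather(loader.load("a/b/1"), loader.load("a/b/2"))

results = asyncio.run(demo())
```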

In this case, "alias" is directly available on this object, which means it will automatically get picked up by Ariadne and returned. Done!

Let's look at a slightly more complicated Device field, such as connected. Search the code for ObjectType("Device") to find where it's defined, and then look for the resolver function for the "connected" field, as above. You should find a function that takes our DbDeviceInfo object, does some work to check whether the device responds, and returns a boolean accordingly. Done.

With this knowledge, you should be able to navigate the code by looking for the relevant object types and resolvers: a resolver can return a new object, whose fields get resolved in turn, and so on. The chain can go many steps deep, but the pattern is the same.
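As a toy model of this resolution chain (plain Python, not Ariadne's real API), a field with an explicit resolver runs a function, and any other field falls back to attribute lookup on the parent object:

```python
class DbDeviceInfo:
    """Stand-in for the database info object returned by the device loader."""
    def __init__(self, name):
        self.name = name
        self.alias = f"{name}-alias"  # hypothetical alias value

def resolve_device(_parent, name):
    return DbDeviceInfo(name)

def resolve_connected(device):
    # The real resolver checks whether the device responds; we just pretend.
    return device.name.startswith("sys/")

resolvers = {
    ("Query", "device"): resolve_device,
    ("Device", "connected"): resolve_connected,
}

def resolve(type_name, field, parent, **arguments):
    resolver = resolvers.get((type_name, field))
    if resolver is not None:
        return resolver(parent, **arguments)
    # Default behaviour, as in Ariadne: read the field off the parent object.
    return getattr(parent, field)

device = resolve("Query", "device", None, name="sys/tg_test/1")
alias = resolve("Device", "alias", device)        # attribute fallback
connected = resolve("Device", "connected", device)  # explicit resolver
```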

Running Unit Tests in Docker for TangoGQL

To run unit tests in Docker for the TangoGQL project, follow these steps. These instructions assume you are using PowerShell:

Steps to Run Unit Tests in Docker

  1. Build the Docker Image
    Open PowerShell and build the Docker image from the project directory:

    docker build -t tangogql .
    
  2. Run the container

    docker run --rm -it tangogql bash
    
  3. Install Test Dependencies
    Install the test dependencies for TangoGQL (TangoGQL itself is already installed):

    uv pip install ".[tests]"
    
  4. Run Unit Tests
    Finally, run the test suite:

    pytest
    

These steps will set up and run the unit tests within the Docker container.

