DAQ Config Server

A service to put and get your config values from.

Comprises a FastAPI backend with a Valkey database for CRUD operations on config values, and a Chakra/React frontend for easier management.

Currently the scope is JUST storing and retrieving feature flags for Hyperion/I03 UDC, but we hope to expand this to replace all DAQ config files.

Source: https://github.com/DiamondLightSource/daq-config-server
Docker: docker run ghcr.io/DiamondLightSource/daq-config-server:latest
Releases: https://github.com/DiamondLightSource/daq-config-server/releases

A simple app for storing and fetching values. It has a Valkey (Redis-compatible) instance as well as options for file-backed legacy values (e.g. beamlineParameters...).

Currently the server application always needs to be run with the --dev flag, as it cannot yet look at the DLS filesystem to find the real beamline parameter files.

To use the config values in an experimental application (e.g. Hyperion) you can do:

```python
from daq_config_server.client import ConfigServer

config_server = ConfigServer("<service ip address>", <port>)

use_stub_offsets: bool = config_server.best_effort_get_feature_flag("use_stub_offsets")
```
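If the server is unreachable you may want a defensive wrapper around the call. The helper below is hypothetical (not part of the client API), sketched under the assumption that `best_effort_get_feature_flag` raises an exception on connection failure:

```python
def get_flag_or_default(server, name: str, default: bool = False) -> bool:
    """Hypothetical helper: fetch a feature flag, falling back to a safe
    default if the config server cannot be reached."""
    try:
        return server.best_effort_get_feature_flag(name)
    except Exception:
        # Assumption: the client raises on network errors; adjust this to
        # the actual failure mode of best_effort_get_feature_flag.
        return default
```

This keeps the experimental application running with a known-safe flag value even when the service is down.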

To work on the GUI you will probably need to run:

```shell
module load node
npm install
```

in the gui directory to set up the environment.

Testing and deployment

There is a convenience script, ./deployment/build_and_push_all.sh, to build all the containers. It takes a --dev option to push containers with -dev appended to their names, and a --no-push option for local development. This ensures that environment variables for dev or prod builds are baked into the built containers, such as whether the GUI points at the subdomain URL or at localhost, and the root_path of the FastAPI app. To push to the registry you must have authenticated with gcloud by loading a kubernetes module and running gcloud auth login.

To deploy a live version, run the above script with no arguments and then, while logged in to argus in the daq-config-server namespace, run kubectl rollout restart deployment. If it is not currently deployed, you can deploy it with helm install daq-config ./helmchart.

To test locally, you can build everything with ./deployment/build_and_push_all.sh --dev --no-push and then run the containers daq-config-server-dev (with the command daq-config-server --dev), daq-config-server-db-dev, and daq-config-server-gui-dev, all with the --net host option.

To test on pollux, log in to pollux in your namespace and run:

```shell
helm install daq-config ./helmchart/ --values dev-values.yaml
```

followed by:

```shell
kubectl port-forward service/daq-config-server-svc 8080 8555
```

after which you should be able to access the frontend on http://localhost:8080 and the API on http://localhost:8555.
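Once the port-forward is running, you can sanity-check the forwarded API from Python. A minimal sketch, assuming port 8555 as forwarded by the command above (the specific endpoint paths you fetch are assumptions, not documented routes):

```python
import urllib.request

BASE = "http://localhost:8555"  # API port forwarded by kubectl above

def url_for(path: str) -> str:
    # Build a full URL against the forwarded API.
    return BASE + "/" + path.lstrip("/")

def fetch(path: str) -> str:
    # Perform a plain GET and return the response body as text.
    with urllib.request.urlopen(url_for(path)) as resp:
        return resp.read().decode()
```

For example, fetch("/docs") should return the interactive docs page, since FastAPI serves them at /docs by default (unless the app disables them).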
