SQLAlchemy dialect for OGC WFS

SQLAlchemy dialect for OGC WFS as a Superset plugin.

Register the dialect

Create a requirements-local.txt file according to the Superset documentation and insert the following line:

superset_wfs_dialect

The dialect must then be registered in your Superset config file, e.g. superset_config_docker.py when using the Docker setup:

from sqlalchemy.dialects import registry
registry.register("wfs", "superset_wfs_dialect.dialect", "WfsDialect")

Start or restart Superset and continue as described in the Start the application section.

Add a WFS database connection:

  • select Data > Connect database in the submenu
  • choose "Other" from the list of "Supported Databases"
  • enter the SQLAlchemy URI of your WFS as wfs://[...] (i.e. replace the https:// in your WFS URL with wfs://)
  • test the connection
  • create a dataset
  • create a chart/dashboard
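The URI in the third step is just the WFS endpoint URL with its scheme swapped out; a minimal sketch of that rewrite (the endpoint URL below is a hypothetical example, and accepting plain http:// as well is an assumption):

```python
def wfs_uri(url: str) -> str:
    """Turn a WFS endpoint URL into a SQLAlchemy URI for this dialect
    by replacing the https:// (or http://) scheme with wfs://."""
    for scheme in ("https://", "http://"):
        if url.startswith(scheme):
            return "wfs://" + url[len(scheme):]
    raise ValueError(f"not an http(s) URL: {url}")

# Hypothetical WFS endpoint:
print(wfs_uri("https://example.com/geoserver/ows"))  # wfs://example.com/geoserver/ows
```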

Development

Prerequisites for development

  • Docker Engine >= version 28
  • python >= version 3.10.12
  • Check out this project

Installation

For debugging and code completion, run the following in a terminal from the project root:

python3 -m venv .venv
source .venv/bin/activate
pip install -e .

or create a virtual environment via VS Code:

https://code.visualstudio.com/docs/python/python-tutorial#_create-a-virtual-environment.

Start Superset with the registered plugin:

docker compose up -d --build

Debugging during development

Debugging during development can be started from VS Code by pressing F5. Make sure the Python interpreter from the previously created venv is selected. Breakpoints set in VS Code will then be hit.

Start the application

When in development mode, open http://localhost:8088/. Otherwise, open the corresponding URL of the installed Superset instance.

Publishing a Development Version to PyPI

Requirements

  • You must be on the main branch
  • Your working directory must be clean (no uncommitted changes)
  • You must have push access to the repository
  • A valid PYPI_TOKEN is configured in GitHub Secrets (used by the GitHub Actions workflow)

Releasing a new version

  1. Run the release script with the desired version number (e.g. 0.0.1):

    ./release.sh 0.0.1
    

    This will:

    • Update the version field in setup.py
    • Commit the change to main
    • Create a Git tag, e.g. 0.0.1
    • Push the tag to GitHub
  2. The GitHub Actions workflow will be triggered by the tag:

    • Build the package
    • Upload it to PyPI

Notes

  • Versions must follow the format X.Y.Z (e.g. 0.1.0)
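The X.Y.Z constraint can be checked before invoking the release script; a minimal sketch (the regex is an assumption, not necessarily the exact check release.sh performs):

```python
import re

# X.Y.Z: three dot-separated non-negative integers (assumed format).
VERSION_RE = re.compile(r"^\d+\.\d+\.\d+$")

def is_valid_version(version: str) -> bool:
    """Return True if the string matches the X.Y.Z release format."""
    return VERSION_RE.fullmatch(version) is not None

print(is_valid_version("0.1.0"))   # True
print(is_valid_version("v0.1.0"))  # False: leading 'v' is not allowed
print(is_valid_version("0.1"))     # False: patch component missing
```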
