
webknossos-connect

A webKnossos compatible data connector written in Python.

webKnossos-connect serves as an adapter between the webKnossos data store interface and alternative data storage servers (e.g. BossDB) or static files hosted on Cloud Storage (e.g. Neuroglancer Precomputed).


Available Adapters / Supported Data Formats:

  • BossDB
  • Neuroglancer Precomputed (on Cloud Storage)
  • Tiled TIFF

Usage

1. Installation / Docker

Install webKnossos-connect using Docker or use the instructions for native installation below.

docker-compose up --build webknossos-connect

2. Connecting to webKnossos

Register your webknossos-connect instance with your main webKnossos instance. Modify the webKnossos Postgres database:

INSERT INTO "webknossos"."datastores"("name","url","publicurl","key","isscratch","isdeleted","isforeign","isconnector")
VALUES (E'connect', E'http://localhost:8000', E'http://localhost:8000', E'secret-key', FALSE, FALSE, FALSE, TRUE);
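
If you prefer to script this step, here is a minimal Python sketch that runs the same insert with psycopg2 (the connection parameters are placeholders for a local webKnossos Postgres instance, adjust them to your setup):

import psycopg2

# Placeholder connection settings for a local webKnossos Postgres database.
conn = psycopg2.connect(host="localhost", dbname="webknossos", user="postgres", password="postgres")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        INSERT INTO webknossos.datastores
            (name, url, publicurl, key, isscratch, isdeleted, isforeign, isconnector)
        VALUES ('connect', 'http://localhost:8000', 'http://localhost:8000', 'secret-key',
                FALSE, FALSE, FALSE, TRUE)
        """
    )
conn.close()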

3. Adding Datasets

Add and configure datasets in webKnossos-connect to make them available for viewing in webKnossos.

3.1 REST API

You can add new datasets to webKnossos-connect through the REST interface. POST a JSON configuration to:

http://<webKnossos-connect>/data/datasets?token

The access token can be obtained from your user profile in the webKnossos main instance. Read more in the webKnossos docs.

Example JSON body. More examples can be found here.

{
    "boss": {
        "Test Organisation": {
            "ara": {
                "domain": "https://api.boss.neurodata.io",
                "collection": "ara_2016",
                "experiment": "sagittal_50um",
                "username": "<NEURODATA_IO_USER>",
                "password": "<NEURODATA_IO_PW>"
            }
        }
    },
    "neuroglancer": {
        "Test Organisation": {
            "fafb_v14": {
                "layers": {
                    "image": {
                        "source": "gs://neuroglancer-fafb-data/fafb_v14/fafb_v14_clahe",
                        "type": "image"
                    }
                }
            }
        }
    },
    "tiff": {
        "Test Organization": {
            "my_2d_tiff_dataset": {
                "scale": [2.1,2.1]
            }
        }
    }
}

Note that tiff datasets are hosted locally. Create compatible TIFFs with

vips tiffsave source.tif color.tif --tile --pyramid --bigtiff --compression none --tile-width 256 --tile-height 256

and save the generated color.tif file at data/binary/sample_organization/my_2d_tiff_dataset.
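
If you would rather script the conversion, here is a minimal Python sketch using the pyvips bindings that mirrors the vips tiffsave command above (it assumes pyvips and libvips are installed; the file names are the same placeholders):

import pyvips

# Load the source image and write a tiled, pyramidal BigTIFF,
# mirroring the vips tiffsave command above.
image = pyvips.Image.new_from_file("source.tif")
image.tiffsave(
    "color.tif",
    tile=True,
    pyramid=True,
    bigtiff=True,
    compression="none",
    tile_width=256,
    tile_height=256,
)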

CURL Example

curl http://<webKnossos-connect>/data/datasets -X POST -H "Content-Type: application/json" --data-binary "@datasets.json"
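
The same request as a minimal Python sketch using the requests package (the host, token value, and datasets.json file are placeholders):

import json

import requests

# Placeholders: adjust the host and token to your webKnossos-connect setup.
url = "http://localhost:8000/data/datasets"
token = "<webKnossos-user-token>"

with open("datasets.json") as f:
    config = json.load(f)

# POST the dataset configuration; the token is passed as a query parameter.
response = requests.post(url, params={"token": token}, json=config)
response.raise_for_status()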

3.2 webKnossos UI

Alternatively, new datasets can be added directly through the webKnossos UI. Configure and import a new dataset from the webKnossos dashboard. (Dashboard -> Datasets -> Upload Dataset -> Add wk-connect Dataset)

Read more in the webKnossos docs.

3.3 Default test datasets

By default, some public datasets are added to webKnossos-connect to get you started when using the Docker image.

Development

In Docker :whale:

  • Start it with docker-compose up dev
  • Run other commands, e.g. docker-compose run --rm dev pipenv run lint
  • Check below for moar commands.
  • If you change the packages, rebuild the image with docker-compose build dev

Native

Installation

You need Python 3.8 with poetry installed.

pip install poetry
poetry install

Run

  • Add webknossos-connect to the webKnossos database:
    INSERT INTO "webknossos"."datastores"("name","url","publicurl","key","isscratch","isdeleted","isforeign","isconnector")
    VALUES (E'connect', E'http://localhost:8000', E'http://localhost:8000', E'secret-key', FALSE, FALSE, FALSE, TRUE);
    
  • python -m wkconnect
  • curl http://localhost:8000/api/neuroglancer/Demo_Lab/test \
      -X POST -H "Content-Type: application/json" \
      --data-binary "@datasets.json"
    

Moar

Useful commands:

  • Lint with pylint & flake8
  • Format with black, isort & autoflake
  • Type-check with mypy
  • Benchmark with timeit
  • Trace with py-spy

Use the commands:

  • scripts/pretty.sh
  • scripts/pretty-check.sh
  • scripts/lint.sh
  • scripts/type-check.sh
  • benchmarks/run_all.sh

Trace the server on http://localhost:8000/trace.

License

AGPLv3 Copyright scalable minds

Download files

Source Distribution

wkconnect-21.8.0.tar.gz (56.9 kB)


File details

Details for the file wkconnect-21.8.0.tar.gz.

File metadata

  • Download URL: wkconnect-21.8.0.tar.gz
  • Upload date:
  • Size: 56.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.8 CPython/3.8.11 Linux/5.8.0-1039-azure

File hashes

Hashes for wkconnect-21.8.0.tar.gz:

  • SHA256: bed43966500fc9f12bea2c612761ba8e1967042cbe9a4779015ab7600d9bd625
  • MD5: f7a35eb313c7a85ae3f6754f58daae41
  • BLAKE2b-256: 16c481f2bf7b99e52a3ecc13b968d7db5bbb89890cdaf8ed3be0a810c7ade8aa

