# Geostore
LINZ central storage, management and access solution for important geospatial datasets. Developed by Land Information New Zealand.
## Prerequisites
### Geostore VPC
A Geostore VPC must exist in your AWS account before deploying this application. At LINZ, VPCs are managed internally by the IT team. If you are deploying this application outside LINZ, you will need to create a VPC with the following tags:
- "ApplicationName": "geostore"
- "ApplicationLayer": "networking"
You can achieve this by adding the `networking_stack` (`infrastructure/networking_stack.py`) to `app.py` before deployment as a dependency of `application_stack` (`infrastructure/application_stack.py`).
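As a rough illustration, the `app.py` wiring might look like the sketch below. The stack class names and construct IDs here are assumptions, not necessarily what the project uses; check the actual modules for the real constructor signatures.

```python
# Hypothetical app.py wiring; NetworkingStack/ApplicationStack and their
# constructor arguments are assumptions, not the project's real class names.
from aws_cdk import core

from infrastructure.application_stack import ApplicationStack  # assumed name
from infrastructure.networking_stack import NetworkingStack  # assumed name

app = core.App()

networking = NetworkingStack(app, "networking")
application = ApplicationStack(app, "application")

# Ensure the tagged VPC exists before the application stack deploys.
application.add_dependency(networking)

app.synth()
```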
### Verify infrastructure settings
This infrastructure by default includes some Toitū Te Whenua/LINZ-specific parts, controlled by settings in `cdk.json`. To disable these, simply remove the context entries or set them to `false`.
The settings are:

- `enableLDSAccess`: if true, gives LINZ Data Service/Koordinates read access to the storage bucket.
- `enableOpenTopographyAccess`: if true, gives OpenTopography read access to the storage bucket.
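For orientation, context flags like these are typically read with `try_get_context` inside a stack. The sketch below is illustrative only: the stack class, bucket construct and placeholder account principal are assumptions, not the project's actual code.

```python
# Illustrative sketch: gating bucket read access on a cdk.json context flag.
# The stack class, bucket construct and account ID below are placeholders.
from aws_cdk import aws_iam, aws_s3, core


class StorageStack(core.Stack):
    def __init__(self, scope: core.Construct, construct_id: str) -> None:
        super().__init__(scope, construct_id)

        storage_bucket = aws_s3.Bucket(self, "storage")

        # A missing or false context entry leaves the bucket private.
        if self.node.try_get_context("enableLDSAccess"):
            storage_bucket.grant_read(aws_iam.AccountPrincipal("111111111111"))


app = core.App()
StorageStack(app, "storage-stack")
app.synth()
```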
## Development setup
One-time setup which generally assumes that you're in the project directory.
### Common
- Install Docker
- Configure Docker:
  - Add yourself to the "docker" group: `sudo usermod --append --groups=docker "$USER"`
  - Log out and back in to enable the new group
### Ubuntu
- Install [nvm](https://github.com/nvm-sh/nvm):

  ```bash
  cd "$(mktemp --directory)"
  wget https://raw.githubusercontent.com/nvm-sh/nvm/master/install.sh
  echo 'b674516f001d331c517be63c1baeaf71de6cbb6d68a44112bf2cff39a6bc246a  install.sh' | sha256sum --check && bash install.sh
  ```
- Install Poetry:

  ```bash
  cd "$(mktemp --directory)"
  wget https://raw.githubusercontent.com/python-poetry/poetry/master/install-poetry.py
  echo 'b35d059be6f343ac1f05ae56e8eaaaebb34da8c92424ee00133821d7f11e3a9c  install-poetry.py' | sha256sum --check && python3 install-poetry.py
  ```
- Install Pyenv:

  ```bash
  sudo apt-get update
  sudo apt-get install --no-install-recommends build-essential curl libbz2-dev libffi-dev liblzma-dev libncurses5-dev libreadline-dev libsqlite3-dev libssl-dev libxml2-dev libxmlsec1-dev llvm make tk-dev wget xz-utils zlib1g-dev
  cd "$(mktemp --directory)"
  wget https://github.com/pyenv/pyenv-installer/raw/master/bin/pyenv-installer
  echo '3aa49f2b3b77556272a80a01fe44d46733f4862dbbbc956002dc944c428bebd8  pyenv-installer' | sha256sum --check && bash pyenv-installer
  ```
- Enable the above by adding the following to your `~/.bashrc`:

  ```bash
  if [[ -e "${HOME}/.local/bin" ]]
  then
      PATH="${HOME}/.local/bin:${PATH}"
  fi

  # nvm <https://github.com/nvm-sh/nvm>
  if [[ -d "${HOME}/.nvm" ]]
  then
      export NVM_DIR="${HOME}/.nvm"
      # shellcheck source=/dev/null
      [[ -s "${NVM_DIR}/nvm.sh" ]] && . "${NVM_DIR}/nvm.sh"
      # shellcheck source=/dev/null
      [[ -s "${NVM_DIR}/bash_completion" ]] && . "${NVM_DIR}/bash_completion"
  fi

  # Pyenv <https://github.com/pyenv/pyenv>
  if [[ -e "${HOME}/.pyenv" ]]
  then
      PATH="${HOME}/.pyenv/bin:${PATH}"
      eval "$(pyenv init --path)"
      eval "$(pyenv init -)"
      eval "$(pyenv virtualenv-init -)"
  fi
  ```
- Install project Node.js: `nvm install`
- Run `./reset-dev-env.bash --all` to install packages.
- Enable the dev environment: `. activate-dev-env.bash`.
- Install [aws-azure-login](https://www.npmjs.com/package/aws-azure-login).
Re-run `./reset-dev-env.bash` when packages change. One easy way to keep it up to date is to run it before every workday, with a crontab entry like this template:
```
HOME='/home/USERNAME'
0 2 * * 1-5 export PATH="${HOME}/.pyenv/shims:${HOME}/.pyenv/bin:${HOME}/.poetry/bin:/root/bin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/run/current-system/sw/bin" && cd "PATH_TO_GEOSTORE" && ./reset-dev-env.bash --all
```
Replace `USERNAME` and `PATH_TO_GEOSTORE` with your values, resulting in something like this:
```
HOME='/home/jdoe'
0 2 * * 1-5 export PATH="${HOME}/.pyenv/shims:${HOME}/.pyenv/bin:${HOME}/.poetry/bin:/root/bin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/run/current-system/sw/bin" && cd "${HOME}/dev/geostore" && ./reset-dev-env.bash --all
```
Re-run `. activate-dev-env.bash` in each shell.
### Nix
- Run `nix-shell`.
- Optional: Install and configure direnv, then run `direnv allow .` to load the Nix shell whenever you `cd` into the project.
Restart your `nix-shell` when packages change.
When setting up the project SDK, point it to `.run/python`, which is a symlink to the latest Nix shell Python executable.
### Optional
Enable Dependabot alerts by email. (This is optional since it currently can't be set per repository or organisation, so it affects any repos where you have access to Dependabot alerts.)
## AWS Infrastructure deployment
- Get AWS credentials (see: https://www.npmjs.com/package/aws-azure-login) for 12 hours:

  ```bash
  aws-azure-login --no-prompt --profile=<AWS-PROFILE-NAME>
  ```

- Set environment variables as needed (see the sketch at the end of this section):

  - `GEOSTORE_ENV_NAME`: sets the deployment environment. For your personal development stack, set `GEOSTORE_ENV_NAME` to your username:

    ```bash
    export GEOSTORE_ENV_NAME="$USER"
    ```

    Other values used by CI pipelines include: prod, nonprod, ci, dev or any string without spaces. Default: test.

  - `RESOURCE_REMOVAL_POLICY`: determines whether resources containing user content, like the Geostore storage S3 bucket or application database tables, are preserved even if they are removed from the stack or the stack is deleted. Supported values:

    - DESTROY: destroy the resource when it is removed from the stack or the stack is deleted (default)
    - RETAIN: retain the orphaned resource when it is removed from the stack or the stack is deleted

  - `GEOSTORE_SAML_IDENTITY_PROVIDER_ARN`: SAML identity provider AWS ARN.

- Bootstrap CDK (only once per profile):

  ```bash
  cdk --profile=<AWS-PROFILE-NAME> bootstrap aws://unknown-account/ap-southeast-2
  ```

- Deploy CDK stack:

  ```bash
  cdk --profile=<AWS-PROFILE-NAME> deploy --all
  ```

  Once comfortable with CDK you can add `--require-approval=never` above to deploy non-interactively.

If you `export AWS_PROFILE=<AWS-PROFILE-NAME>` you won't need the `--profile=<AWS-PROFILE-NAME>` arguments above.
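The sketch below illustrates one way these variables might be read during CDK synthesis. It is an assumption for orientation only; the variable handling in the actual stacks may differ.

```python
# Illustrative sketch only: reading the documented environment variables.
# The constant names are placeholders, not the project's real code.
import os

from aws_cdk import core

# "test" matches the documented default environment name.
ENVIRONMENT_NAME = os.environ.get("GEOSTORE_ENV_NAME", "test")

# DESTROY is the documented default; RETAIN keeps orphaned user-content resources.
REMOVAL_POLICY = (
    core.RemovalPolicy.RETAIN
    if os.environ.get("RESOURCE_REMOVAL_POLICY") == "RETAIN"
    else core.RemovalPolicy.DESTROY
)
```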
## Development
### Adding or updating Python dependencies
To add a development-only package: `poetry add --dev PACKAGE='*'`
To add a production package:

- Install the package using `poetry add --optional PACKAGE='*'`.
- Put the package in alphabetical order within the list.
- Mention the package in the relevant lists in `[tool.poetry.extras]`.
- Make sure to update packages separately from adding packages. Basically, follow this process before running `poetry add`, and do the equivalent when updating Node.js packages or changing Docker base images:

  - Check out a new branch on top of origin/master: `git checkout -b update-python-packages origin/master`.
  - Update the Python packages: `poetry update`. The rest of the steps are only necessary if this step changes `poetry.lock`. Otherwise you can just change back to the original branch and delete "update-python-packages".
  - Commit, push and create a pull request.
  - Check out the branch where you originally wanted to run `poetry add`.
  - Rebase the branch onto the package update branch: `git rebase update-python-packages`.

  At this point any `poetry add` commands should not result in any package updates other than those necessary to fulfil the new packages' dependencies.

  Rationale: Keeping upgrades and other package changes apart is useful when reading/bisecting history. It also makes code review easier.
- When there's a merge conflict in `poetry.lock`, first check whether either or both commits contain a package upgrade:

  - If neither of them does, simply run `git checkout --ours -- poetry.lock && poetry lock --no-update`.
  - If one of them does, check out that file (`git checkout --ours -- poetry.lock` or `git checkout --theirs -- poetry.lock`) and run `poetry lock --no-update` to regenerate `poetry.lock` with the current package versions.
  - If both of them do, manually merge `poetry.lock` and run `poetry lock --no-update`.

  Rationale: This should avoid accidentally down- or upgrading when resolving a merge conflict.
- Update the code coverage minimum in `pyproject.toml` and the badge above on branches which increase it.

  Rationale: By updating this continuously we avoid missing test regressions in new branches.
### Upgrading Python version
To minimise the chance of discrepancies between environments it is important to run the same (or as close as possible) version of Python in the development environment, in the pipeline, and in deployed instances. At the moment the available versions are constrained by the following:
- The Ubuntu packages used in the Dockerfile
- The AWS base images used as Lambda runtimes
- The pyenv versions used for local development
- The supported Poetry versions used for all dependencies
When updating Python versions you have to check that all of the above can be kept at the same minor version, and ideally at the same patch level.
### Running tests
Prerequisites:
- Authenticated to a profile which has access to a deployed Geostore.
To launch the full test suite, run `pytest`.
### Debugging
To start debugging at a specific line, insert `import ipdb; ipdb.set_trace()`.
To debug a test run, add `--capture=no` to the `pytest` arguments. You can also automatically start debugging at a test failure point with `--pdb --pdbcls=IPython.terminal.debugger:Pdb`.
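For example, a breakpoint can be dropped just before a failing assertion. The test below is a made-up illustration; the file and names are not part of the real suite.

```python
# Hypothetical test file, e.g. tests/test_example.py; names are placeholders.
def test_title_prefix() -> None:
    title = "example-dataset"

    import ipdb

    ipdb.set_trace()  # pytest pauses here when run with --capture=no

    assert title.startswith("example")
```

Running `pytest --capture=no` against that file then drops you into an interactive ipdb prompt at the breakpoint.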
### Upgrading CI runner
`jobs.<job_id>.runs-on` in the `.github` workflow files sets the runner type per job. We should make sure all of these use the latest specific Ubuntu LTS version (`ubuntu-YY.MM` as opposed to `ubuntu-latest`), so that the version changes only when we're ready for it.
### GitHub Actions cache clearing
To throw away the current cache (for example in case of a cache corruption), simply change the `CACHE_SEED` repository "secret", for example to the current timestamp (`date +%s`). Subsequent jobs will then ignore the existing cache.