An analysis environment for satellite and other earth observation data
Open Data Cube Core
Overview
The Open Data Cube Core provides an integrated gridded data analysis environment for decades of analysis-ready Earth observation satellite data, and related data, from multiple satellite and other acquisition systems.
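Once installed and connected to a populated index, the Python API can be used along these lines (a minimal sketch only; the product name ls8_example and the query extents are placeholders for whatever exists in your own index):

# Minimal sketch of the datacube Python API.
# The product name "ls8_example" is hypothetical -- substitute a product from your index.
import datacube

dc = datacube.Datacube(app="readme-example")

# Inspect which products the connected index knows about.
print(dc.list_products())

# Load a small spatio-temporal cube as an xarray.Dataset.
data = dc.load(
    product="ls8_example",
    x=(149.0, 149.2),
    y=(-35.4, -35.2),
    time=("2020-01-01", "2020-03-31"),
)
print(data)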
Documentation
See the user guide for installation and usage of the datacube, and for documentation of the API.
Join our Discord if you need help setting up or using the Open Data Cube.
Please help us to keep the Open Data Cube community open and inclusive by reading and following our Code of Conduct.
This is a 1.9.x series release of the Open Data Cube. If you are migrating from a 1.8.x series release, please refer to the 1.8.x to 1.9.x Migration Notes.
Requirements
System
PostgreSQL 10+
Python 3.9+
Developer setup
Clone:
git clone https://github.com/opendatacube/datacube-core.git
Create a Python environment for using the ODC. We recommend Mambaforge as the easiest way to handle Python dependencies.
mamba env create -f conda-environment.yml
conda activate cubeenv
Install a development (editable) version of datacube-core:
cd datacube-core
pip install --upgrade -e .
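To confirm the editable install is importable, an optional sanity check (not part of the official steps) is:

# Verify that the editable install of datacube is on the Python path.
import datacube
print(datacube.__version__)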
Install the pre-commit hooks to help follow ODC coding conventions when committing with git.
pre-commit install
Run unit tests + PyLint
Install test dependencies using:
pip install --upgrade -e '.[test]'
If installing the test dependencies fails, please lodge an issue.
Run unit tests with:
./check-code.sh
(This script approximates what is run by GitHub Actions; you can alternatively run pytest yourself.)
Or run all tests, including integration tests:
./check-code.sh integration_tests
This assumes a password-less Postgres database named pgintegration is running on localhost. Otherwise, copy integration_tests/integration.conf to ~/.datacube_integration.conf and edit it to suit your setup (see the example below). For instructions on setting up a password-less Postgres database, see the documentation.
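For example, a customised ~/.datacube_integration.conf might look roughly like this (an illustrative sketch only; base yours on the shipped integration_tests/integration.conf, which defines the database environments the tests expect):

# Illustrative sketch -- section names and keys should follow integration_tests/integration.conf.
[datacube]
db_hostname: localhost
db_database: pgintegration
index_driver: default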
Alternatively, you can use the opendatacube/datacube-tests Docker image to run tests. The image includes a database server pre-configured for running integration tests. Add the --with-docker option as the first argument to the ./check-code.sh script.
./check-code.sh --with-docker integration_tests
To run individual tests in a Docker container:
docker build --tag=opendatacube/datacube-tests-local --no-cache --progress plain -f docker/Dockerfile .
docker run -ti -v $(pwd):/code opendatacube/datacube-tests-local:latest pytest integration_tests/test_filename.py::test_function_name
Developer setup on Ubuntu
Building a Python virtual environment on Ubuntu suitable for development work.
Install dependencies:
sudo apt-get update
sudo apt-get install -y \
  autoconf automake build-essential make cmake \
  graphviz \
  python3-venv \
  python3-dev \
  libpq-dev \
  libyaml-dev \
  libnetcdf-dev \
  libudunits2-dev
Build the Python virtual environment:
pyenv="${HOME}/.envs/odc"  # Change to suit your needs
mkdir -p "${pyenv}"
python3 -m venv "${pyenv}"
source "${pyenv}/bin/activate"
pip install -U pip wheel cython numpy
pip install -e '.[dev]'
pip install flake8 mypy pylint autoflake black