# DeepTrees 🌳

Tree Crown Segmentation and Analysis in Remote Sensing Imagery with PyTorch

## Installation
To install the package, clone the repository and install the dependencies:

```bash
git clone https://codebase.helmholtz.cloud/ai-consultants-dkrz/DeepTrees.git
cd DeepTrees

## create a new conda environment
conda create --name deeptree
conda activate deeptree
conda install -c conda-forge gdal==3.9.2 pip
pip install -r requirements.txt
```

Alternatively, install from PyPI:

```bash
pip install deeptrees
```
## Documentation

This library is documented using Sphinx. To build the documentation, run:

```bash
sphinx-apidoc -o docs/source deeptrees
cd docs
make html
```

This creates the documentation in the `docs/build` directory. Open the `index.html` file in your browser to view it.
## Predict on a list of images

Call `predict` on a list of images, passing the corresponding config file:

```python
from deeptrees import predict

predict(image_path=["list of image paths"], config_path="config_path")
```
## Scripts

### Preprocessing

#### Expected directory structure

The root folder is `/work/ka1176/shared_data/2024-ufz-deeptree/polygon-labelling/`. Sync the folders `tiles` and `labels` with the labeled tiles provided by UFZ. The unlabeled tiles go into `pool_tiles`.
```
|-- tiles
|   |-- tile_0_0.tif
|   |-- tile_0_1.tif
|   |-- ...
|-- labels
|   |-- label_tile_0_0.shp
|   |-- label_tile_0_1.shp
|   |-- ...
|-- pool_tiles
|   |-- tile_4_7.tif
|   |-- tile_4_8.tif
|   |-- ...
```

Create the new empty directories:

```
|-- masks
|-- outlines
|-- dist_trafo
```
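The empty directories can be created with a short Python snippet, for example (a sketch; the `root` path here is a placeholder, replace it with the actual root folder above):

```python
from pathlib import Path

# Placeholder root; on the cluster this would be the
# polygon-labelling root folder mentioned above.
root = Path("polygon-labelling")

# Create the empty output directories expected by the preprocessing steps.
for name in ("masks", "outlines", "dist_trafo"):
    (root / name).mkdir(parents=True, exist_ok=True)
```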
### Training

Adapt your own config file based on the defaults in `train_halle.yaml` as needed. For inspiration for a derived config file for finetuning, check `finetune_halle.yaml`.

Run the script like this:

```bash
python scripts/train.py                               # default config, trains from scratch
python scripts/train.py --config-name=finetune_halle  # finetune with a pretrained model
python scripts/train.py --config-name=yourconfig      # with your own config
```
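A derived config might look like the following sketch (assuming Hydra-style config composition; only `model.pretrained_model` and `data.ground_truth_labels` are keys taken from this README, everything else is hypothetical):

```yaml
# my_config.yaml -- hypothetical derived config, modeled on finetune_halle.yaml
defaults:
  - train_halle  # start from the default training config

model:
  # placeholder checkpoint name; substitute your own pretrained model
  pretrained_model: "path/to/pretrained_model.pt"

data:
  ground_truth_labels: null  # skip re-generating the ground truth
```

It would then be run as `python scripts/train.py --config-name=my_config`.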
To re-generate the ground truth for training, make sure to pass the label directory in `data.ground_truth_labels`. To turn it off, pass `data.ground_truth_labels=null`.

You can override individual parameters on the command line, e.g.:

```bash
python scripts/train.py trainer.fast_dev_run=True
```

To resume training from a checkpoint, take care to pass the Hydra argument in quotes, so the shell does not intercept the string (the pretrained model name contains `=`):

```bash
python scripts/train.py 'model.pretrained_model="Unet-resnet18_epochs=209_lr=0.0001_width=224_bs=32_divby=255_custom_color_augs_k=0_jitted.pt"'
```
#### Training logs

View the MLflow logs that were created during training.

TODO
### Inference

Run the inference script with the corresponding config file. Adjust as needed:

```bash
python scripts/test.py --config-name=inference_halle
```
## Semantic Versioning

This repository has automatic semantic versioning enabled. To create new releases, we need to merge into the default `finetuning-halle` branch.

Semantic Versioning, or SemVer, is a versioning standard for software (see the [SemVer website](https://semver.org)). Given a version number MAJOR.MINOR.PATCH, increment the:
- MAJOR version when you make incompatible API changes
- MINOR version when you add functionality in a backward compatible manner
- PATCH version when you make backward compatible bug fixes
- Additional labels for pre-release and build metadata are available as extensions to the MAJOR.MINOR.PATCH format
See the SemVer rules and all possible commit prefixes in the .releaserc.json file.
| Prefix | Explanation | Example |
|---|---|---|
| feat | A new feature was implemented as part of the commit, so the MINOR part of the version will be increased once this is merged to the main branch | `feat: model training updated` |
| fix | A bug was fixed, so the PATCH part of the version will be increased once this is merged to the main branch | `fix: fix a bug that causes the user to not be properly informed when a job finishes` |
The implementation is based on [Automatic Semantic Versioning for GitLab Projects](https://mobiuscode.dev/posts/Automatic-Semantic-Versioning-for-GitLab-Projects/).
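The bump rules above can be sketched in Python (illustrative only; the actual releases are produced automatically by the CI pipeline):

```python
def bump(version: str, prefix: str) -> str:
    """Return the next version for a commit with the given prefix.

    Only the `feat` and `fix` prefixes from the table above are modeled;
    MAJOR bumps for breaking changes are omitted for brevity.
    """
    major, minor, patch = (int(part) for part in version.split("."))
    if prefix == "feat":  # new feature -> MINOR bump, PATCH resets to 0
        return f"{major}.{minor + 1}.0"
    if prefix == "fix":   # bug fix -> PATCH bump
        return f"{major}.{minor}.{patch + 1}"
    return version        # other prefixes trigger no release here


print(bump("1.6.0", "feat"))  # 1.7.0
print(bump("1.6.0", "fix"))   # 1.6.1
```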
## License

This repository is licensed under the MIT License. For more information, see the `LICENSE.md` file.
Cite as
@article{khan2025torchtrees,
author = {Taimur Khan and Caroline Arnold and Harsh Grover},
title = {DeepTrees: Tree Crown Segmentation and Analysis in Remote Sensing Imagery with PyTorch},
journal = {arXiv},
year = {2025},
archivePrefix = {arXiv},
eprint = {XXXXX.YYYYY},
primaryClass = {cs.CV}
}