# Lit Ecology Classifier
Lit Ecology Classifier is a machine learning project designed for image classification tasks. It leverages PyTorch Lightning for streamlined training and evaluation processes.
## Features
- Easy configuration and setup
- Built on PyTorch Lightning for robust training and evaluation
- Supports training on multiple GPUs
- Test-time augmentation (TTA) for enhanced evaluation
- Integration with Weights and Biases for experiment tracking
## Installation
To install Lit Ecology Classifier, use pip:
```bash
pip install lit-ecology-classifier
```
## Usage
### Training
To train the model, use the following command:
```bash
python -m lit_ecology_classifier.main --max_epochs 20 --dataset phyto --priority config/priority.json
```
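The `--priority` flag points at a JSON file of priority classes. Its exact schema is not documented here; purely as an illustration (key and class names are hypothetical), such a file might list the class names to prioritise:

```json
{
  "priority_classes": ["cyanobacteria", "diatom"]
}
```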
### Inference
To run inference on unlabelled data, use the following command:
```bash
python -m lit_ecology_classifier.predict --datapath /path/to/data.tar --model_path /path/to/model.ckpt --outpath ./predictions/
```
## Configuration
The project uses an argument parser for configuration. Here are some of the key arguments:
### Training Arguments
- `--datapath`: Path to the tar file containing the training data.
- `--train_outpath`: Output path for training artifacts.
- `--main_param_path`: Main directory where the training parameters are saved.
- `--dataset`: Name of the dataset.
- `--use_wandb`: Log the run to Weights and Biases.
- `--priority_classes`: Path to the JSON file with priority classes.
- `--balance_classes`: Balance the classes for training.
- `--batch_size`: Batch size for training.
- `--max_epochs`: Number of epochs to train.
- `--lr`: Learning rate for training.
- `--lr_factor`: Learning-rate factor for training the full body of the model.
- `--no_gpu`: Train on CPU only (disable GPU use).
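Class balancing such as that enabled by `--balance_classes` is commonly implemented with inverse-frequency sample weights. The sketch below illustrates that general idea only; it is not taken from the lit-ecology-classifier source, which may use a different strategy:

```python
from collections import Counter

def sample_weights(labels):
    """Weight each sample by the inverse frequency of its class,
    so rare classes are drawn as often as common ones."""
    counts = Counter(labels)
    return [1.0 / counts[y] for y in labels]

# Three samples of one class, one of another: the rare class
# gets three times the per-sample weight.
weights = sample_weights(["algae", "algae", "algae", "diatom"])
```

Weights like these are typically passed to a weighted random sampler when building the training data loader.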
### Inference Arguments
- `--outpath`: Directory where predictions are saved.
- `--model_path`: Path to the model checkpoint file.
- `--datapath`: Path to the tar file containing the data to classify.
- `--no_gpu`: Run inference on CPU only (disable GPU use).
- `--no_TTA`: Disable test-time augmentation.
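Conceptually, test-time augmentation runs the classifier on several augmented views of the same image and averages the per-class probabilities. The sketch below shows that idea in plain Python; the function and names are illustrative, not the library's actual API:

```python
def tta_predict(predict_fn, views):
    """Average class probabilities over augmented views of one image."""
    probs = [predict_fn(v) for v in views]
    n_classes = len(probs[0])
    return [sum(p[c] for p in probs) / len(probs) for c in range(n_classes)]

# Toy stand-in for a model: fixed probabilities per "view".
fake_outputs = {
    "original": [0.6, 0.4],
    "flipped":  [0.8, 0.2],
}
avg = tta_predict(fake_outputs.get, ["original", "flipped"])
```

Averaging over views tends to smooth out augmentation-sensitive errors, at the cost of one forward pass per view, which is why `--no_TTA` exists as a speed switch.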
## Documentation
Detailed documentation for this project is available at [Read the Docs](https://lit-ecology-classifier.readthedocs.io).
### Example SLURM Job Submission Script
Here is an example SLURM job submission script for training on multiple GPUs:
```bash
#!/bin/bash
#SBATCH --account="em09"
#SBATCH --constraint=gpu
#SBATCH --nodes=2
#SBATCH --ntasks-per-core=1
#SBATCH --ntasks-per-node=1
#SBATCH --cpus-per-task=12
#SBATCH --partition=normal
#SBATCH --hint=nomultithread
#SBATCH --output=slurm/slurm_%j.out
#SBATCH --error=slurm/slurm_%j.err

export OMP_NUM_THREADS=12  # $SLURM_CPUS_PER_TASK

cd ${SCRATCH}/lit_ecology_classifier
module purge
module load daint-gpu cray-python
source lit_ecology/bin/activate

python -m lit_ecology_classifier.main --max_epochs 2 --dataset phyto --priority config/priority.json
```