Hierarchical Multi-Label Classification Network in Pytorch
HMC Torch Project Summary
Overview
This document summarizes the HMC Torch project, a hierarchical multi-label classification network implemented in PyTorch. The project is currently at version 0.0.1 and includes various Jupyter notebooks, scripts, and configuration files.
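To give a feel for what hierarchical multi-label classification looks like in PyTorch, here is a generic sketch of a per-level sigmoid classifier head. This is an illustration of the technique, not the project's actual architecture; the class name, layer sizes, and the parent-conditioning scheme are all invented for the example:

```python
import torch
import torch.nn as nn

class HMCHead(nn.Module):
    """Illustrative hierarchical multi-label head: one sigmoid layer per
    hierarchy level, each level conditioned on the previous level's logits.
    Not the project's actual model -- a generic sketch of the idea."""

    def __init__(self, in_features: int, level_sizes: list[int]):
        super().__init__()
        self.heads = nn.ModuleList()
        prev = 0
        for size in level_sizes:
            # each level sees the shared features plus the parent level's logits
            self.heads.append(nn.Linear(in_features + prev, size))
            prev = size

    def forward(self, x: torch.Tensor) -> list[torch.Tensor]:
        outputs = []
        parent_logits = x.new_zeros(x.size(0), 0)  # empty for the root level
        for head in self.heads:
            logits = head(torch.cat([x, parent_logits], dim=1))
            outputs.append(torch.sigmoid(logits))  # independent per-label probabilities
            parent_logits = logits
        return outputs
```

Each level produces independent per-label probabilities (multi-label), while the concatenated parent logits let deeper levels respect the hierarchy.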
Key Updates
- Version: 0.0.1
- Main Changes:
- Removed the main function from the training file to simplify the code structure.
- Updated `.gitignore` to improve version control management.
Project Structure
This section outlines the key components of the project to help users navigate the codebase. The project contains the following key files and directories:
- Notebooks:
  - `Dataset.ipynb`: Handles dataset loading and preprocessing.
  - `Executer-model.ipynb`: Contains the model execution logic.
  - `Inference.ipynb`: Used for making predictions with the trained model.
- Scripts:
  - `executer.py`: Core execution script for the model.
- Configuration:
  - `pyproject.toml`: Project configuration file.
  - `poetry.lock`: Dependency lock file.
- Documentation:
  - `README.md`: Provides an overview and instructions for the project.
  - `LICENSE`: Licensing information for the project.
Prerequisites Installation
Before setting up the project, ensure you have the following prerequisites installed and configured:
1. Create a Virtual Environment
It is recommended to use a virtual environment to manage dependencies. Run the following command to create one:
python -m venv .venv
Activation Steps:
- Command Prompt (Windows):
.\.venv\Scripts\activate.bat
- PowerShell (Windows):
.\.venv\Scripts\Activate.ps1
- Linux/MacOS:
source .venv/bin/activate
2. Install Poetry
Poetry is used for dependency management. Install it using pip or another method:
pip install poetry
3a. GPU Setup (Optional)
If you plan to use a GPU for training, configure the project with the following commands:
poetry source add pytorch-gpu https://download.pytorch.org/whl/cu118 --priority=explicit &&
poetry source remove pytorch-cpu || true
3b. CPU Setup (Optional)
If you plan to use a CPU for training, configure the project with the following commands:
poetry source add pytorch-cpu https://download.pytorch.org/whl/cpu --priority=explicit &&
poetry source remove pytorch-gpu || true
These steps ensure that Poetry resolves PyTorch from the package source matching your chosen hardware.
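After running the source command, `pyproject.toml` should gain a section along these lines (shown for the GPU case; the exact formatting can vary with the Poetry version, and the `torch` entry below is only an illustration of how a dependency is pinned to an explicit source):

```toml
[[tool.poetry.source]]
name = "pytorch-gpu"
url = "https://download.pytorch.org/whl/cu118"
priority = "explicit"

[tool.poetry.dependencies]
# a dependency served from an explicit source must name that source, e.g.:
torch = { version = "*", source = "pytorch-gpu" }
```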
4. Install Dependencies
Once the virtual environment is activated and Poetry is installed, run the following command to install all project dependencies:
poetry install --no-root --with dev
By completing these steps, your environment will be fully prepared to run the HMC Torch project.
5. Download the Dataset via Kaggle (Optional)
1. Download dataset via Kaggle CLI
pip install kaggle
kaggle datasets download brunosette/gene-ontology-original
2. Download dataset via curl
# Export your Kaggle username and API key
# export KAGGLE_USERNAME=<YOUR USERNAME>
# export KAGGLE_KEY=<YOUR KAGGLE KEY>
curl -L -u $KAGGLE_USERNAME:$KAGGLE_KEY \
  -o ~/Downloads/gene-ontology-original.zip \
  https://www.kaggle.com/api/v1/datasets/download/brunosette/gene-ontology-original
mkdir -p data
unzip ~/Downloads/gene-ontology-original.zip -d data/
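If you prefer to stay inside Python, the same extraction step can be done with the standard library. This is a generic convenience sketch, not part of the project; the function name and default destination are invented:

```python
import zipfile
from pathlib import Path

def extract_dataset(zip_path: str, dest: str = "data") -> list[str]:
    """Extract the downloaded Kaggle archive into dest and
    return the list of member names inside the archive."""
    Path(dest).mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)
        return zf.namelist()
```

For example, `extract_dataset("~/Downloads/gene-ontology-original.zip")` (after expanding the home directory) mirrors the `mkdir` + `unzip` commands above.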
6. Running the Project
To execute the training process, follow these steps:
6.1. Make the Script Executable
Before running the training script, ensure it has the necessary execution permissions:
chmod +x run.sh
6.2. Run the Training Script
You can run the training script with the desired device configuration:
- For CPU execution:
./run.sh --device cpu
- For GPU execution:
./run.sh --device gpu
These commands will initiate the training process using the specified hardware.
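`run.sh` ships with the repository and its contents are not shown here; a minimal wrapper with the same command-line interface might look like the following sketch (the `executer.py` entry point is an assumption based on the project structure above, hence it is left commented out):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of a run.sh wrapper; the actual script may differ.
set -euo pipefail

DEVICE="cpu"  # default device when --device is not given
while [[ $# -gt 0 ]]; do
  case "$1" in
    --device) DEVICE="$2"; shift 2 ;;
    *) echo "unknown option: $1" >&2; exit 1 ;;
  esac
done

echo "launching training on ${DEVICE}"
# The entry point is assumed from the project structure above:
# poetry run python executer.py --device "${DEVICE}"
```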
6.3. Deploy Locally
To deploy the project locally, you only need to run the deployment script, as it automates all the steps outlined above. Ensure the script has execution permissions and specify the desired hardware configuration:
- Make the script executable:
chmod +x deploy_local.sh
- Run the deployment script:
  - For GPU deployment:
./deploy_local.sh cuda
  - For CPU deployment:
./deploy_local.sh cpu
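Since the deployment script "automates all the steps outlined above", a plausible shape for it is a function that chains the virtual environment, Poetry, source selection, and training steps. This is a hypothetical sketch, not the repository's actual `deploy_local.sh`:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of deploy_local.sh; the real script may differ.
set -euo pipefail

deploy_local() {
  local device="${1:-cpu}"   # first argument selects the hardware: cpu or cuda

  python -m venv .venv
  source .venv/bin/activate
  pip install poetry

  if [ "$device" = "cuda" ]; then
    poetry source add pytorch-gpu https://download.pytorch.org/whl/cu118 --priority=explicit
  else
    poetry source add pytorch-cpu https://download.pytorch.org/whl/cpu --priority=explicit
  fi

  poetry install --no-root --with dev
  ./run.sh --device "$device"
}
```

Invoking `deploy_local cuda` or `deploy_local cpu` then reproduces sections 1-6 end to end.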
File details
Details for the file hmc_torch-0.0.7.tar.gz.
File metadata
- Download URL: hmc_torch-0.0.7.tar.gz
- Upload date:
- Size: 49.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 0e5a967aedf810f71cbf1500e04fd44392392536ac813257c2dce5b1ac6e7a5f |
| MD5 | 936cc676d7d6ee33f77ef3676e2d223d |
| BLAKE2b-256 | e7d658f8d1ebbb11f9531042599acccd5e2e2151a098007573854a9294d4bbee |
Provenance
The following attestation bundles were made for hmc_torch-0.0.7.tar.gz:
Publisher: python-publish.yml on Sette/hmc-torch
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: hmc_torch-0.0.7.tar.gz
- Subject digest: 0e5a967aedf810f71cbf1500e04fd44392392536ac813257c2dce5b1ac6e7a5f
- Sigstore transparency entry: 1273870227
- Sigstore integration time:
- Permalink: Sette/hmc-torch@d131f5bcddf8618730023ba135d16e552e37876c
- Branch / Tag: refs/tags/0.0.7
- Owner: https://github.com/Sette
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@d131f5bcddf8618730023ba135d16e552e37876c
- Trigger Event: release
File details
Details for the file hmc_torch-0.0.7-py3-none-any.whl.
File metadata
- Download URL: hmc_torch-0.0.7-py3-none-any.whl
- Upload date:
- Size: 60.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 94e99108bce82032e97872a7334f51f041d0123a4c1e7925b2d310861e97c15d |
| MD5 | 6860f413559a91243fb97b1899c8e025 |
| BLAKE2b-256 | e0b7ec9365323e766886ad9206c798552a47555b6ec91a226cc7ffa173fca469 |
Provenance
The following attestation bundles were made for hmc_torch-0.0.7-py3-none-any.whl:
Publisher: python-publish.yml on Sette/hmc-torch
Statement:
- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: hmc_torch-0.0.7-py3-none-any.whl
- Subject digest: 94e99108bce82032e97872a7334f51f041d0123a4c1e7925b2d310861e97c15d
- Sigstore transparency entry: 1273870389
- Sigstore integration time:
- Permalink: Sette/hmc-torch@d131f5bcddf8618730023ba135d16e552e37876c
- Branch / Tag: refs/tags/0.0.7
- Owner: https://github.com/Sette
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: python-publish.yml@d131f5bcddf8618730023ba135d16e552e37876c
- Trigger Event: release