
Hierarchical Multi-Label Classification Network in PyTorch

Project description

HMC Torch Project Summary

Overview

This document summarizes the HMC Torch project, a hierarchical multi-label classification network implemented in PyTorch. The project is currently at version 0.0.1 and includes various Jupyter notebooks, scripts, and configuration files.

Key Updates

  • Version: 0.0.1
  • Main Changes:
    • Removed the main function from the training file to simplify the code structure.
    • Updated .gitignore to improve version control management.

Project Structure

To help users navigate the codebase, this section outlines the project's key files and directories:

  • Notebooks:
    • Dataset.ipynb: Handles dataset loading and preprocessing.
    • Executer-model.ipynb: Contains the model execution logic.
    • Inference.ipynb: Used for making predictions with the trained model.
  • Scripts:
    • executer.py: Core execution script for the model.
  • Configuration:
    • pyproject.toml: Project configuration file.
    • poetry.lock: Dependency lock file.
  • Documentation:
    • README.md: Provides an overview and instructions for the project.
    • LICENSE: Licensing information for the project.

Prerequisites Installation

Before setting up the project, ensure you have the following prerequisites installed and configured:

1. Create a Virtual Environment

It is recommended to use a virtual environment to manage dependencies. Run the following command to create one:

python -m venv .venv

Activation Steps:

  • Command Prompt (Windows):
    .\.venv\Scripts\activate.bat
    
  • PowerShell (Windows):
    .\.venv\Scripts\Activate.ps1
    
  • Linux/MacOS:
    source .venv/bin/activate
    

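On Linux/macOS, the create-and-activate steps can be verified end to end; once activation succeeds, the active `python` should resolve inside `.venv`:

```shell
# Create the environment, activate it, and confirm the active interpreter
# lives inside .venv (Linux/macOS paths).
python3 -m venv .venv
. .venv/bin/activate
command -v python          # should point at .venv/bin/python
deactivate
```

If `command -v python` still points at the system interpreter, the activation step did not take effect in the current shell.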
2. Install Poetry

Poetry is used for dependency management. Install it using pip or another method:

pip install poetry

3a. GPU Setup (Optional)

If you plan to use a GPU for training, configure the project with the following commands:

poetry source add pytorch-gpu https://download.pytorch.org/whl/cu118 --priority=explicit &&
poetry source remove pytorch-cpu || true

3b. CPU Setup (Optional)

If you plan to use a CPU for training, configure the project with the following commands:

poetry source add pytorch-cpu https://download.pytorch.org/whl/cpu --priority=explicit &&
poetry source remove pytorch-gpu || true

These steps will ensure that the project is configured for your chosen hardware.
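For reference, `poetry source add … --priority=explicit` records an entry like the one below in pyproject.toml. Because the source is explicit, Poetry only consults it for dependencies that name it, so torch must be pinned to the source as sketched (the `^2.1` version bound is illustrative, not the project's actual constraint):

```toml
# Written by: poetry source add pytorch-gpu https://download.pytorch.org/whl/cu118 --priority=explicit
[[tool.poetry.source]]
name = "pytorch-gpu"
url = "https://download.pytorch.org/whl/cu118"
priority = "explicit"

[tool.poetry.dependencies]
# Illustrative pin; the real constraint lives in the project's pyproject.toml
torch = { version = "^2.1", source = "pytorch-gpu" }
```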

4. Install Dependencies

Once the virtual environment is activated and Poetry is installed, run the following command to install all project dependencies:

poetry install --no-root --with dev

By completing these steps, your environment will be fully prepared to run the HMC Torch project.

5. Download the Dataset via Kaggle (Optional)

1. Download the dataset via the Kaggle CLI:

   pip install kaggle
   kaggle datasets download brunosette/gene-ontology-original

2. Download the dataset via curl:

   # Export your Kaggle username and API key
   # export KAGGLE_USERNAME=<YOUR USERNAME>
   # export KAGGLE_KEY=<YOUR KAGGLE KEY>

   curl -L -u "$KAGGLE_USERNAME:$KAGGLE_KEY" \
     -o ~/Downloads/gene-ontology-original.zip \
     https://www.kaggle.com/api/v1/datasets/download/brunosette/gene-ontology-original

Whichever method you use, extract the archive into a data/ directory (adjust the path to match your download location):

   mkdir -p data
   unzip ~/Downloads/gene-ontology-original.zip -d data/
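The extract step can be rehearsed without Kaggle credentials by substituting a stand-in archive (`sample.zip` and `sample.csv` below are hypothetical; the real file is `gene-ontology-original.zip`):

```shell
# Rehearse the unzip-into-data/ step with a stand-in archive built locally.
mkdir -p data
echo "id,label" > sample.csv
python3 -m zipfile -c sample.zip sample.csv   # stand-in for the Kaggle download
python3 -m zipfile -e sample.zip data/        # same extract pattern as above
ls data/                                      # sample.csv should appear here
```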

6. Running the Project

To execute the training process, follow these steps:

6.1. Make the Script Executable

Before running the training script, ensure it has the necessary execution permissions:

chmod +x run.sh

6.2. Run the Training Script

You can run the training script with the desired device configuration:

  • For CPU Execution:

    ./run.sh --device cpu
    
  • For GPU Execution:

    ./run.sh --device gpu
    

These commands will initiate the training process using the specified hardware.
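The `--device` handling inside run.sh is not shown in this summary; the sketch below shows how such a flag is typically parsed and forwarded. The `parse_device` helper and the `executer.py --device` flag are assumptions for illustration, not the project's actual code:

```shell
# Hypothetical sketch: parse a --device flag and forward it to the trainer.
parse_device() {
  device="cpu"                     # default when no flag is given
  while [ "$#" -gt 0 ]; do
    case "$1" in
      --device) device="$2"; shift 2 ;;
      *) shift ;;                  # ignore unrelated arguments
    esac
  done
  echo "$device"
}

DEVICE="$(parse_device "$@")"
echo "selected device: $DEVICE"
# poetry run python executer.py --device "$DEVICE"   # assumed entry point
```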

6.3. Deploy Locally

To deploy the project locally, you only need to run the deployment script, as it automates all the steps outlined above. Ensure the script has execution permissions and specify the desired hardware configuration:

  1. Make the script executable:
     chmod +x deploy_local.sh
  2. Run the deployment script:
     • For GPU Deployment:
       ./deploy_local.sh cuda
       
     • For CPU Deployment:
       ./deploy_local.sh cpu
    
