Embeddings and loss functions for different data types.

Installation

Install using pip:

pip install polytorch

Or install the latest version from GitHub:

pip install git+https://github.com/rbturnbull/polytorch.git

Data Types

This package allows you to use different data types as inputs and outputs in PyTorch models.

Binary Data

from polytorch import BinaryData

binary_data = BinaryData()

# Or with labels and colors
binary_data = BinaryData(labels=["no_feature", "with_feature"], colors=["red", "blue"])

Categorical Data

from polytorch import CategoricalData

category_count = 5  # Number of categories
categorical_data = CategoricalData(category_count)

# Or with labels, colors and label smoothing
categorical_data = CategoricalData(
    category_count=category_count,
    labels=["cat", "dog", "fish", "bird", "reptile"],
    colors=["red", "blue", "green", "yellow", "purple"],
    label_smoothing=0.1,
)
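With `label_smoothing=0.1`, the hard one-hot target is softened: the true class keeps most of the probability mass and the remainder is spread evenly over all classes. A minimal pure-Python sketch of the idea (independent of polytorch's internals):

```python
# Sketch of label smoothing for a one-hot classification target.
# With smoothing epsilon, every class receives epsilon / num_classes,
# and the true class additionally receives 1 - epsilon.
def smooth_one_hot(target_index: int, num_classes: int, epsilon: float) -> list[float]:
    uniform = epsilon / num_classes
    smoothed = [uniform] * num_classes
    smoothed[target_index] += 1.0 - epsilon
    return smoothed

# Class 1 of 5 with epsilon = 0.1: roughly [0.02, 0.92, 0.02, 0.02, 0.02]
smoothed = smooth_one_hot(target_index=1, num_classes=5, epsilon=0.1)
```

Smoothed targets still sum to 1 but penalize over-confident predictions less harshly than hard labels.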

Ordinal Data

from polytorch import OrdinalData

ordinal_data = OrdinalData()

# Or with color
ordinal_data = OrdinalData(color="pink")

Continuous Data

from polytorch import ContinuousData

continuous_data = ContinuousData()

# Or with color
continuous_data = ContinuousData(color="orange")

Hierarchical Data

from polytorch import HierarchicalData
from hierarchicalsoftmax import SoftmaxNode

root = SoftmaxNode("root")
child1 = SoftmaxNode("child1", parent=root)
child2 = SoftmaxNode("child2", parent=root)
tip1 = SoftmaxNode("tip1", parent=child1)
tip2 = SoftmaxNode("tip2", parent=child1)
tip3 = SoftmaxNode("tip3", parent=child2)
tip4 = SoftmaxNode("tip4", parent=child2)


hierarchical_data = HierarchicalData(root)
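Under a hierarchical softmax, the probability of a leaf is the product of softmax probabilities along its path from the root. A pure-Python sketch of that decomposition for the tree above, using illustrative logit values (this is the general idea, not polytorch's exact implementation):

```python
import math

# P(tip1) = P(child1 | root) * P(tip1 | child1):
# a softmax at each internal node, multiplied along the path.
def softmax(logits: list[float]) -> list[float]:
    exps = [math.exp(l) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

root_logits = [1.0, 0.0]     # logits over (child1, child2) -- illustrative
child1_logits = [0.5, -0.5]  # logits over (tip1, tip2) -- illustrative

p_child1 = softmax(root_logits)[0]
p_tip1_given_child1 = softmax(child1_logits)[0]
p_tip1 = p_child1 * p_tip1_given_child1
```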

Embedding your data

from torch import nn
from polytorch import PolyEmbedding

class MyModule(nn.Module):
    def __init__(self, embedding_size:int=128):
        super(MyModule, self).__init__()

        input_types = [binary_data, categorical_data]  # for example; other data types work too
        self.embedding = PolyEmbedding(input_types=input_types, embedding_size=embedding_size)

        # Other modules
        ...

    def forward(self, x_binary, x_categorical):

        embedded = self.embedding(x_binary, x_categorical)

        # Use the embedded features in your model
        ...
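The idea behind a shared embedding is that each input type is mapped into a vector of the same `embedding_size` so the results can be combined into one feature vector. A toy pure-Python sketch of that combination, here by element-wise summation (an assumption for illustration; polytorch's actual combination strategy may differ):

```python
# Toy lookup tables mapping each input value to a 4-dimensional vector.
# The tables and values are illustrative, not polytorch internals.
embedding_size = 4

binary_table = {
    0: [0.0, 0.0, 0.0, 0.0],
    1: [0.1, 0.2, 0.3, 0.4],
}
category_table = {
    0: [1.0, 0.0, 0.0, 0.0],
    1: [0.0, 1.0, 0.0, 0.0],
}

def embed(x_binary: int, x_categorical: int) -> list[float]:
    # Look up each input's vector and sum element-wise.
    b = binary_table[x_binary]
    c = category_table[x_categorical]
    return [bi + ci for bi, ci in zip(b, c)]

embedded = embed(1, 0)  # approximately [1.1, 0.2, 0.3, 0.4]
```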

Outputting your data

You can also get your model to output to different data types.

from torch import nn
from polytorch import PolyLazyLinear, CategoricalData, BinaryData, ContinuousData

output_types = [
    CategoricalData(category_count=5, loss_weighting=0.5),  # For example, a categorical output with 5 categories
    BinaryData(loss_weighting=1.0),                        # A binary output
    ContinuousData(loss_weighting=0.1)                     # A continuous output
]

class MyModule(nn.Module):
    def __init__(self, output_types):
        super(MyModule, self).__init__()

        self.output = PolyLazyLinear(output_types=output_types)

    def forward(self, x):
        # Your model logic
        ...

        # Output to different data types
        return self.output(x)

Then set this as the loss:

from polytorch import PolyLoss

loss_module = PolyLoss(output_types=output_types)

# In your training loop
loss = loss_module(predictions, categorical_target, binary_target, continuous_target)
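Each output type contributes its own loss term, scaled by its `loss_weighting`. A pure-Python sketch with illustrative loss values, assuming the terms are combined as a simple weighted sum (the exact reduction polytorch uses may differ):

```python
# Illustrative per-head loss values -- not computed from real data.
categorical_loss = 1.6  # e.g. cross-entropy on the 5-category head
binary_loss = 0.7       # e.g. binary cross-entropy
continuous_loss = 2.5   # e.g. mean squared error

# Weighted sum using the loss_weighting values from output_types above.
total_loss = 0.5 * categorical_loss + 1.0 * binary_loss + 0.1 * continuous_loss
# total_loss == 1.75
```

The weights let you balance heads whose raw losses live on very different scales, such as cross-entropy versus mean squared error.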

Credits

Robert Turnbull. For more information contact: <robert.turnbull@unimelb.edu.au>

Created using torchapp (https://github.com/rbturnbull/torchapp).
