
A curated list of machine learning and deep learning helpers.


Core

Device Management

This section shows how to select a computation device when using PyTorch. The homa helpers provide a consistent interface across the CPU, CUDA-enabled GPUs, and Apple Silicon's MPS backend.

  • cpu(): Forces tensors or models onto the CPU.
  • cuda(): Moves tensors or models onto a CUDA GPU (if available). Commonly used in high‑performance training.
  • mps(): Uses Apple's Metal Performance Shaders backend on macOS.
  • get_device(): Automatically infers the best available device in the order: CUDA → MPS → CPU.

import torch
from homa import cpu, mps, cuda, get_device

# explicitly selecting devices
torch.tensor([1, 2, 3, 4, 5]).to(cpu())
torch.tensor([1, 2, 3, 4, 5]).to(cuda())
torch.tensor([1, 2, 3, 4, 5]).to(mps())

# automatic device selection
torch.tensor([1, 2, 3, 4, 5]).to(get_device())

This design mirrors common best practices in deep learning workflows, promoting device‑agnostic code.
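The CUDA → MPS → CPU fallback order can be sketched in plain PyTorch. The function below is a hypothetical re-implementation for illustration, not homa's actual source:

```python
import torch

def best_device() -> torch.device:
    """Pick a device in the order CUDA -> MPS -> CPU, mirroring get_device()."""
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        return torch.device("mps")
    return torch.device("cpu")

# Tensors created anywhere in the codebase can then be moved uniformly.
t = torch.tensor([1, 2, 3, 4, 5]).to(best_device())
```

Because the device is resolved once at call time, the same script runs unchanged on a CUDA workstation, an Apple Silicon laptop, or a CPU-only machine.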

Loading Settings

homa.settings allows you to attach a settings.json file to your project and access its values directly in your code. This is useful for hyperparameters, configuration management, or experiment logging.

Example settings.json:

{
  "epochs": 100,
  "learning_rate": 0.001
}

Loading settings in Python:

from homa import settings

for epoch in range(settings("epochs")):
    pass

The helper reads and caches the JSON content, providing dictionary‑like access without requiring boilerplate file‑loading logic.
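The read-once-then-cache behaviour can be sketched with the standard library alone. This is a minimal stand-in for the helper, not homa's implementation; the `path` parameter is added here for illustration:

```python
import json
from functools import lru_cache

@lru_cache(maxsize=1)
def _load(path: str = "settings.json") -> dict:
    # The file is read and parsed only on the first call; later calls
    # are answered from the cached dictionary.
    with open(path) as fh:
        return json.load(fh)

def settings(key: str, path: str = "settings.json"):
    return _load(path)[key]
```

Caching matters when a value such as `settings("epochs")` is read inside a training loop: the JSON file is parsed once rather than on every iteration.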

Vision

Resnet

homa.vision.Resnet implements a standard ResNet‑50 architecture, commonly used in image classification tasks. This class bundles the model, optimizer, and training loop helpers for fast prototyping.

You can train the model directly using a PyTorch DataLoader:

from homa.vision import Resnet

model = Resnet(num_classes=10, lr=0.001)
for epoch in range(10):
    model.train(train_dataloader)

Alternatively, you may manually unpack the DataLoader and pass data batches yourself:

from homa.vision import Resnet

model = Resnet(num_classes=10, lr=0.001)
for epoch in range(10):
    for x, y in train_dataloader:
        model.train(x, y)

This interface is influenced by modern PyTorch training utilities and mirrors patterns seen in high‑level frameworks while keeping full transparency over the training loop.
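For readers who want to see what a single `model.train(x, y)` call abstracts away, the standard PyTorch step it corresponds to looks roughly like this. The internals are an assumption, shown with a stand-in linear classifier rather than the bundled ResNet-50:

```python
import torch
from torch import nn

model = nn.Linear(4, 10)  # stand-in for the ResNet-50 backbone
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
criterion = nn.CrossEntropyLoss()

def train_step(x: torch.Tensor, y: torch.Tensor) -> float:
    """One optimization step: forward pass, loss, backward pass, update."""
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()

loss = train_step(torch.randn(8, 4), torch.randint(0, 10, (8,)))
```

The helper removes this boilerplate while leaving each step (zeroing gradients, backpropagation, the optimizer update) in the usual PyTorch order.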

Loss Functions

Logit Normalization

LogitNorm is a cross-entropy variant that normalizes the logits before computing the loss, decoupling the loss value from logit magnitude. It was introduced to mitigate overconfidence and improve calibration and robustness, and it is especially useful in ensembling scenarios, where differing logit scales across models can cause instability.

Typical benefits of LogitNorm include:

  • more stable gradients
  • improved probabilistic calibration
  • robustness to logit scaling differences across models

from homa.loss import LogitNorm

criterion = LogitNorm()

Logit normalization is related to works studying the effect of logit scaling on generalization and calibration in deep networks.
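The core idea can be sketched in a few lines: L2-normalize the logits, divide by a temperature, then apply ordinary cross-entropy (following Wei et al., 2022). The class name, constructor signature, and temperature default below are assumptions for illustration, not homa's API:

```python
import torch
import torch.nn.functional as F

class LogitNormLoss(torch.nn.Module):
    """Cross-entropy on L2-normalized, temperature-scaled logits."""
    def __init__(self, temperature: float = 1.0):
        super().__init__()
        self.temperature = temperature

    def forward(self, logits: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        # Normalizing makes the loss (nearly) invariant to logit scaling.
        norms = torch.norm(logits, p=2, dim=-1, keepdim=True) + 1e-7
        return F.cross_entropy(logits / (norms * self.temperature), target)
```

Because the normalized logits have unit norm, rescaling a model's outputs barely changes the loss, which is the property that makes the technique attractive for ensembles.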

Activation Functions

The table below lists every module that subclasses ActivationFunction, summarizing the computation each one performs in forward.

| ID | Activation | Formula (plain text) |
|----|------------|----------------------|
| 1 | ADA | f(x) = x if x ≥ 0, else x * exp(x) |
| 2 | AOAF | f(x) = max(0, x − b · a) + c · a |
| 3 | AReLU | f(x) = (1 + σ(b)) * max(0, x) + clamp(a, 0.01, 0.99) * min(0, x) |
| 4 | ASiLU | f(x) = arctan(x * σ(x)) |
| 5 | AbsLU | f(x) = x if x ≥ 0, else α * \|x\| |
| 6 | BaseDLReLU | f(x) = x if x ≥ 0, else a * bₜ * x |
| 7 | CaLU | f(x) = x * (arctan(x) / π + 0.5) |
| 8 | DLReLU | Inherits BaseDLReLU |
| 9 | DLU | f(x) = x if x ≥ 0, else x / (1 − x) |
| 10 | DPReLU | f(x) = a x if x ≥ 0, else b x |
| 11 | DRLU | f(x) = max(0, x − α) |
| 12 | DerivativeSiLU | f(x) = σ(x) * (1 + x (1 − σ(x))) |
| 13 | DiffELU | f(x) = x if x ≥ 0, else a * (x eˣ − b e^(b x)) |
| 14 | DoubleSiLU | f(x) = x / (1 + exp(−(−x / (1 + e^(−x))))) |
| 15 | DualLine | f(x) = a x + m if x ≥ 0, else b x + m |
| 16 | EANAF | f(x) = x · g(h(x)) |
| 17 | Elliot | f(x) = 0.5 + (0.5 x)/(1 + \|x\|) |
| 18 | ExponentialDLReLU | Inherits BaseDLReLU |
| 19 | ExponentialSwish | f(x) = exp(−x) · σ(x) |
| 20 | FReLU | f(x) = x + b if x ≥ 0, else b |
| 21 | FlattedTSwish | f(x) = max(0, x) · σ(x) + t |
| 22 | GeneralizedSwish | f(x) = x · σ(exp(−x)) |
| 23 | Gish | f(x) = x · ln(2 − exp(−exp(x))) |
| 24 | IpLU | f(x) = x if x ≥ 0, else x / (1 + \|x\|^α) |
| 25 | LaLU | f(x) = x · (1 − 0.5 e^(−x)) if x ≥ 0, else x · (0.5 e^x) |
| 26 | LeLeLU | f(x) = a x if x ≥ 0, else 0.01 a x |
| 27 | LogSigmoid | f(x) = ln(σ(x)) |
| 28 | Logish | f(x) = x · ln(1 + σ(x)) |
| 29 | MSiLU | f(x) = x σ(x) + 1/4 · e^(−x² − 1) |
| 30 | MaxSig | f(x) = max(x, σ(x)) |
| 31 | MinSin | f(x) = min(x, sin(x)) |
| 32 | NLReLU | f(x) = ln(1 + β · max(0, x)) |
| 33 | NReLU | f(x) = x + a if x ≥ 0, else 0 |
| 34 | NoisyReLU | Inherits NReLU |
| 35 | OAF | f(x) = max(0, x) + x · σ(x) |
| 36 | PERU | f(x) = a x if x ≥ 0, else a x · e^(b x) |
| 37 | PFLU | f(x) = x · 0.5 · (1 + x / √(1 + x²)) |
| 38 | PLAF | f(x) = x − δ if x ≥ 1; = −x − δ if x < −1; = \|x\|^d / d otherwise; δ = 1 − 1/d |
| 39 | Phish | f(x) = x · tanh(GELU(x)) |
| 40 | PiLU | f(x) = a x + c (1 − a) if x ≥ c, else b x + c (1 − b) |
| 41 | PoLU | f(x) = x if x ≥ 0, else (1 − x)^(−α) − 1 |
| 42 | PolyLU | f(x) = x if x ≥ 0, else 1/(1 − x) − 1 |
| 43 | REU | f(x) = x if x ≥ 0, else x · exp(x) |
| 44 | RReLU | f(x) = x if x ≥ 0, else x / a, where a ∈ [lower, upper] |
| 45 | RandomizedSlopedReLU | Inherits SlopedReLU |
| 46 | ReCU | Inherits RePU |
| 47 | RePU | f(x) = max(0, x^α) |
| 48 | ReQU | Inherits RePU |
| 49 | ReSP | f(x) = α x + ln 2 if x ≥ 0, else ln(1 + e^x) |
| 50 | ReSech | f(x) = x · sech(x) |
| 51 | SGELU | f(x) = α x · erf(x / √2) |
| 52 | SaRa | f(x) = x if x ≥ 0, else x / (1 + α e^(−β x)) |
| 53 | Serf | f(x) = x · erf(ln(1 + e^x)) |
| 54 | ShiLU | f(x) = a · max(0, x) + b |
| 55 | ShiftedReLU | f(x) = max(x, −1) |
| 56 | SiELU | f(x) = x · σ(2 √(2/π) · (x + 0.044715 x³)) |
| 57 | SigLU | f(x) = x if x ≥ 0, else (1 − e^(−2x)) / (1 + e^(−2x)) |
| 58 | SigmoidDerivative | f(x) = e^(−x) · σ(x)² |
| 59 | SinSig | f(x) = x · sin((π/2) · σ(x)) |
| 60 | SineReLU | f(x) = x if x ≥ 0, else ε · (sin x − cos x) |
| 61 | SlopedReLU | f(x) = α x if x ≥ 0, else 0 |
| 62 | Smish | f(x) = x · tanh(ln(1 + σ(x))) |
| 63 | SoftModulusQ | f(x) = x² (2 − \|x\|) if \|x\| ≤ 1, else \|x\| |
| 64 | SoftModulusT | f(x) = x · tanh(x / α) |
| 65 | SoftsignRReLU | f(x) = 1/(1 + x)² + x if x ≥ 0, else 1/(1 + x)² + a x |
| 66 | StarReLU | f(x) = a · (max(0, x))² + b |
| 67 | Suish | f(x) = max(x, x · exp(−\|x\|)) |
| 68 | TBSReLU | f(x) = x · tanh((1 − e^(−x)) / (1 + e^(−x))) |
| 69 | TSReLU | f(x) = x · tanh(σ(x)) |
| 70 | TSiLU | Let α = x / (1 + e^(−x)), then f(x) = (e^α − e^(−α)) / (e^α + e^(−α)) |
| 71 | TangentBipolarSigmoidReLU | Inherits TBSReLU |
| 72 | TangentSigmoidReLU | Inherits TSReLU |
| 73 | TanhExp | f(x) = x · tanh(e^x) |
| 74 | TeLU | f(x) = x · tanh(e^x) |
| 75 | ThLU | f(x) = x if x ≥ 0, else tanh(x / 2) |
| 76 | TripleStateSwish | Let a = σ(x), b = σ(x − α), c = σ(x − β); f(x) = x · a · (a + b + c) |
| 77 | mReLU | f(x) = min(max(0, 1 − x), max(0, 1 + x)) |
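Any row of the table can be implemented as a small `nn.Module`. As an example, here is row 62 (Smish, f(x) = x · tanh(ln(1 + σ(x)))) written directly from its formula; this is a standalone sketch, and homa's own class may differ in name or details:

```python
import torch
from torch import nn

class Smish(nn.Module):
    """Smish activation: f(x) = x * tanh(ln(1 + sigmoid(x)))."""
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # log1p(sigmoid(x)) computes ln(1 + sigmoid(x)) with better
        # numerical stability than log(1 + sigmoid(x)).
        return x * torch.tanh(torch.log1p(torch.sigmoid(x)))
```

Modules written this way drop into any `nn.Sequential` in place of `nn.ReLU()` or similar built-ins.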
