Adaptive Input/Output Normalization for deep neural networks. Enables stable training of extremely deep networks through adaptive residual scaling (alpha), based on Babayev's theory.
AION-Torch: Adaptive Input/Output Normalization
AION-Torch is a PyTorch library that implements Adaptive Input/Output Normalization (AION), a method for stabilizing deep neural networks. AION automatically adjusts residual connections to prevent vanishing and exploding gradients, enabling stable training of very deep networks with minimal configuration.
🚀 Features
- Adaptive Residual Scaling: Automatically adjusts residual connection strength based on signal statistics
- Stable Deep Training: Prevents vanishing/exploding gradients even in networks with 1000+ layers
- Drop-in Replacement: Works with any architecture using residual connections (Transformers, ResNets, etc.)
- Distributed Ready: Fully supports DDP with synchronized statistics across all GPUs
- Zero Config: Sensible defaults work out-of-the-box, no hyperparameter tuning needed
📦 Installation
From PyPI

```bash
pip install aion-torch
```
⚡ Quick Start
1. The AionBlock (Recommended)
The easiest way to use AION is to replace your standard residual blocks with AionBlock. It implements the Pre-LayerNorm pattern augmented with AION scaling.
```python
import torch
import torch.nn as nn
from aion_torch import AionBlock

# Define your transformation layer (e.g., Attention or MLP)
mlp_layer = nn.Sequential(
    nn.Linear(512, 2048),
    nn.GELU(),
    nn.Linear(2048, 512),
)

# Wrap it in an AionBlock
# Structure: x + alpha * layer(norm(x))
block = AionBlock(layer=mlp_layer, dim=512)

# Forward pass
x = torch.randn(8, 128, 512)
output = block(x)
```
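To make the wiring concrete, here is a small, torch-free toy of the same structure, `x + alpha * layer(norm(x))`. The helper names and the fixed `alpha` are illustrative assumptions for this sketch, not the library's implementation (in AION proper, `alpha` adapts to signal statistics):

```python
def toy_norm(v):
    # Stand-in for LayerNorm: zero mean, roughly unit variance.
    m = sum(v) / len(v)
    var = sum((x - m) ** 2 for x in v) / len(v)
    return [(x - m) / (var + 1e-5) ** 0.5 for x in v]

def toy_aion_block(x, layer, alpha):
    # Pre-LN residual with a scale: x + alpha * layer(norm(x)).
    y = layer(toy_norm(x))
    return [xi + alpha * yi for xi, yi in zip(x, y)]

out = toy_aion_block([1.0, 2.0, 3.0], lambda v: [2.0 * vi for vi in v], alpha=0.1)
```

With `alpha = 0` the block reduces to the identity, which is why small initial scales keep very deep stacks well-behaved at the start of training.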
2. Low-Level AionResidual
For custom architectures, you can use the AionResidual adapter directly.
```python
import torch.nn as nn
from aion_torch import AionResidual

class MyLayer(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.ffn = nn.Linear(dim, dim)
        # Initialize AION adapter
        self.aion = AionResidual(alpha0=0.1, beta=0.05)

    def forward(self, x):
        residual = x
        x_norm = self.norm(x)
        y = self.ffn(x_norm)
        # Apply adaptive residual connection: x + alpha * y
        return self.aion(residual, y)
```
🧠 How It Works
AION adaptively scales residual connections using a simple but effective formula:
$$ \alpha = \frac{\alpha_0}{1 + \beta \cdot \text{ratio}} $$
where ratio measures the relative magnitude of the transformation output compared to the input. When the network becomes unstable (high ratio), AION automatically reduces the scaling factor. When stable (low ratio), it uses a stronger connection.
Key insight: By maintaining balanced signal propagation, AION ensures gradients flow stably through arbitrarily deep networks without exponential growth or decay.
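The scaling rule itself can be sketched in a few lines. This is an illustrative reading of the formula above, not the library's internal code; the `ratio` argument stands for the output-to-input magnitude ratio:

```python
def aion_alpha(alpha0, beta, ratio):
    # alpha = alpha0 / (1 + beta * ratio):
    #   small ratio (stable signal)   -> alpha stays near alpha0
    #   large ratio (unstable signal) -> alpha is damped toward zero
    return alpha0 / (1.0 + beta * ratio)

aion_alpha(0.1, 0.05, 0.0)    # -> 0.1 (full strength when the block output is quiet)
aion_alpha(0.1, 0.05, 100.0)  # ~ 0.017 (damped when the output dominates the input)
```

Because the damping is monotone in `ratio`, a block that starts to blow up is automatically given a weaker residual contribution on that very step.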
AION as the General Form
Mathematically, several other stabilization methods reduce to special cases of the AION formula in which adaptivity ($\beta$) is turned off:
| Method | AION Equivalent Parameters | Behavior |
|---|---|---|
| DeepNorm | $\beta=0, \alpha_0 = \frac{1}{\sqrt{2L}}$ | Fixed static scaling based on depth |
| Pre-LN | $\beta=0, \alpha_0 = 1$ | No scaling (identity) |
| ReZero | $\beta=0, \alpha_0 = \text{learnable}$ | Learnable static scalar |
| AION | $\beta > 0$ | Dynamic adaptation based on signal energy |
AION generalizes these approaches by adding the control term ($1 + \beta \cdot \text{ratio}$), allowing it to react to instability in real-time rather than relying on static assumptions.
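The table's reductions are easy to verify numerically: with $\beta = 0$ the denominator is constant, so $\alpha$ collapses to whatever static $\alpha_0$ each method prescribes. A hedged sketch using an illustrative `aion_alpha` helper (not library code):

```python
import math

def aion_alpha(alpha0, beta, ratio):
    # The AION rule: alpha = alpha0 / (1 + beta * ratio).
    return alpha0 / (1.0 + beta * ratio)

L = 12  # example network depth, for the DeepNorm row

# With beta = 0 the ratio has no effect: alpha is static.
deepnorm = aion_alpha(1.0 / math.sqrt(2 * L), beta=0.0, ratio=123.0)  # 1/sqrt(2L)
pre_ln   = aion_alpha(1.0, beta=0.0, ratio=123.0)                     # 1.0

# With beta > 0 the same call reacts to the signal ratio.
adaptive = aion_alpha(1.0, beta=0.05, ratio=123.0)                    # < 1.0
```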
📚 Documentation
For the theoretical foundation and mathematical proofs, see the following documents. They are more general mathematical papers that inspired the ideas and are not required to use the library:
- Balance Theory - Core theoretical foundation for AION
🤝 Contributing
Contributions are welcome! Please read our Contributing Guide (coming soon) and check out the issues.
- Fork the repository
- Create your feature branch (`git checkout -b feature/amazing-feature`)
- Commit your changes (`git commit -m 'Add some amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
📜 License
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ for the ML community