Monotonic Dense Layer implemented in Keras

Project description

Monotonic Dense Layer

This Python library implements the Monotonic Dense Layer described in Davor Runje, Sharath M. Shankaranarayana, “Constrained Monotonic Neural Networks”, https://arxiv.org/abs/2205.11775.

If you use this library, please cite:

@misc{https://doi.org/10.48550/arxiv.2205.11775,
  doi = {10.48550/ARXIV.2205.11775},
  url = {https://arxiv.org/abs/2205.11775},
  author = {Davor Runje and Sharath M. Shankaranarayana},
  title = {Constrained Monotonic Neural Networks},
  publisher = {arXiv},
  year = {2022},
  copyright = {Creative Commons Attribution Non Commercial Share Alike 4.0 International}
}

Install

pip install mono-dense-keras

How to use

First, we’ll create a simple dataset for testing using numpy. Input values $x_1$, $x_2$ and $x_3$ are sampled from a normal distribution, while the output value $y$ is calculated according to the following formula, with noise added afterwards:

$y = x_1^3 + \sin\left(\frac{x_2}{2 \pi}\right) + e^{-x_3}$

import numpy as np

rng = np.random.default_rng(42)

def generate_data(no_samples: int, noise: float):
    # sample the inputs from a standard normal distribution
    x = rng.normal(size=(no_samples, 3))
    # y = x_1^3 + sin(x_2 / (2*pi)) + exp(-x_3) + noise
    y = x[:, 0] ** 3
    y += np.sin(x[:, 1] / (2 * np.pi))
    y += np.exp(-x[:, 2])
    y += noise * rng.normal(size=no_samples)
    return x, y

x_train, y_train = generate_data(10_000, noise=0.1)
x_val, y_val = generate_data(10_000, noise=0.)

Now, we’ll use the MonoDense layer instead of the Dense layer. By default, the MonoDense layer assumes the output of the layer is monotonically increasing with respect to all of its inputs. This assumption is always true for all layers except possibly the first one. For the first layer, we use monotonicity_indicator to specify which input parameters are monotonic and whether they are increasing or decreasing:

  • set 1 for an increasingly monotonic parameter,

  • set -1 for a decreasingly monotonic parameter, and

  • set 0 otherwise.

In our case, the monotonicity_indicator is [1, 0, -1] because $y$ is:

  • monotonically increasing w.r.t. $x_1$ $\left(\frac{\partial y}{\partial x_1} = 3 {x_1}^2 \geq 0\right)$, and

  • monotonically decreasing w.r.t. $x_3$ $\left(\frac{\partial y}{\partial x_3} = - e^{-x_3} \leq 0\right)$.

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Dense
from mono_dense_keras import MonoDense

# build a simple model with two hidden layers, but this time using MonoDense layers instead of Dense
model = Sequential()

model.add(Input(shape=(3,)))
monotonicity_indicator = [1, 0, -1]
model.add(MonoDense(128, activation="elu", monotonicity_indicator=monotonicity_indicator))
model.add(MonoDense(128, activation="elu"))
model.add(MonoDense(1))

model.summary()
Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 mono_dense_2 (MonoDense)    (None, 128)               512       
                                                                 
 mono_dense_3 (MonoDense)    (None, 128)               16512     
                                                                 
 mono_dense_4 (MonoDense)    (None, 1)                 129       
                                                                 
=================================================================
Total params: 17,153
Trainable params: 17,153
Non-trainable params: 0
_________________________________________________________________
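Since MonoDense is used here as a drop-in replacement for Dense, the same model can also be written with the Keras functional API. The sketch below is only an equivalent reformulation of the model above, reusing the constructor arguments already shown:

from tensorflow.keras import Model
from tensorflow.keras.layers import Input
from mono_dense_keras import MonoDense

# the same three MonoDense layers, expressed with the functional API
inputs = Input(shape=(3,))
h = MonoDense(128, activation="elu", monotonicity_indicator=[1, 0, -1])(inputs)
h = MonoDense(128, activation="elu")(h)
outputs = MonoDense(1)(h)
functional_model = Model(inputs=inputs, outputs=outputs)
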
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.optimizers.schedules import ExponentialDecay

def train_model(model, initial_learning_rate):
    # train the model with an exponentially decaying learning rate
    # (decay_steps = 10_000 samples / batch size 32, i.e. one decay step per epoch)
    lr_schedule = ExponentialDecay(
        initial_learning_rate=initial_learning_rate,
        decay_steps=10_000 // 32,
        decay_rate=0.9,
    )
    optimizer = Adam(learning_rate=lr_schedule)
    model.compile(optimizer=optimizer, loss="mse")

    model.fit(x=x_train, y=y_train, batch_size=32, validation_data=(x_val, y_val), epochs=10)
    
train_model(model, initial_learning_rate=1.)
Epoch 1/10
313/313 [==============================] - 2s 5ms/step - loss: 0.2590 - val_loss: 0.4990
Epoch 2/10
313/313 [==============================] - 1s 4ms/step - loss: 0.2875 - val_loss: 0.1390
Epoch 3/10
313/313 [==============================] - 1s 4ms/step - loss: 0.2241 - val_loss: 0.0790
Epoch 4/10
313/313 [==============================] - 1s 4ms/step - loss: 0.2297 - val_loss: 0.1043
Epoch 5/10
313/313 [==============================] - 1s 4ms/step - loss: 0.2502 - val_loss: 0.1089
Epoch 6/10
313/313 [==============================] - 1s 4ms/step - loss: 0.2231 - val_loss: 0.0590
Epoch 7/10
313/313 [==============================] - 1s 4ms/step - loss: 0.1715 - val_loss: 0.5466
Epoch 8/10
313/313 [==============================] - 1s 4ms/step - loss: 0.1890 - val_loss: 0.0863
Epoch 9/10
313/313 [==============================] - 1s 4ms/step - loss: 0.1655 - val_loss: 0.1200
Epoch 10/10
313/313 [==============================] - 1s 4ms/step - loss: 0.2332 - val_loss: 0.1196
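As a quick sanity check (not part of the original example), we can sweep $x_1$ over a grid while keeping $x_2$ and $x_3$ fixed; since the architecture itself constrains the network, the trained model’s predictions should be non-decreasing in $x_1$ regardless of how well it fits the data:

import numpy as np

# sweep x_1 from -2 to 2 while holding x_2 and x_3 at zero
x1_grid = np.linspace(-2.0, 2.0, 101)
x_sweep = np.zeros((101, 3))
x_sweep[:, 0] = x1_grid

# predictions should be non-decreasing in x_1 (up to numerical tolerance)
y_pred = model.predict(x_sweep, verbose=0).ravel()
print("non-decreasing in x_1:", bool(np.all(np.diff(y_pred) >= -1e-6)))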

License

The full text of the license is available at:

https://github.com/airtai/mono-dense-keras/blob/main/LICENSE

You are free to:

  • Share — copy and redistribute the material in any medium or format

  • Adapt — remix, transform, and build upon the material

The licensor cannot revoke these freedoms as long as you follow the license terms.

Under the following terms:

  • Attribution — You must give appropriate credit, provide a link to the license, and indicate if changes were made. You may do so in any reasonable manner, but not in any way that suggests the licensor endorses you or your use.

  • NonCommercial — You may not use the material for commercial purposes.

  • ShareAlike — If you remix, transform, or build upon the material, you must distribute your contributions under the same license as the original.

  • No additional restrictions — You may not apply legal terms or technological measures that legally restrict others from doing anything the license permits.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mono-dense-keras-0.0.6rc2.tar.gz (24.6 kB)

Built Distribution

mono_dense_keras-0.0.6rc2-py3-none-any.whl (22.9 kB)
