A basic deep learning package

Project description

A simple neural network implementation with an API similar to TensorFlow's Sequential models.

Getting Started

Dependencies

  • NumPy

Install the dependencies with:

pip3 install -r requirements.txt

Sample Usage

Below is an example of a neural network with two Dense() layers and Tanh() activation functions learning the XOR function. Although seemingly trivial, XOR isn't linearly separable, so linear models such as logistic regression and single-layer perceptrons cannot learn it.

import numpy as np

from nn import Sequential, Input, Dense, Tanh

np.random.seed(42)

# XOR input/output data
X = np.reshape([[0, 0], [0, 1], [1, 0], [1, 1]], (4, 2, 1))
Y = np.reshape([[0], [1], [1], [0]], (4, 1, 1))

# Model instantiation
model = Sequential([
    Input(2),
    Dense(3),
    Tanh(),
    Dense(1),
    Tanh(),
])
model.compile()

# Model training
model.fit(X, Y, epochs=10000, learning_rate=0.01)

# Predict
Y_pred = model.predict(X)
for (y_true, y_pred) in zip(Y, Y_pred):
    print(f'Actual: {y_true}, Predicted: {y_pred}')

Output:

Actual: [[0]], Predicted: [[0.0003956]]
Actual: [[1]], Predicted: [[0.97018558]]
Actual: [[1]], Predicted: [[0.97092169]]
Actual: [[0]], Predicted: [[0.00186825]]

API

Sequential

nn.Sequential(
    layers=[]
)
  • __init__: Instantiates a new Sequential model. If given a list of layers, adds them to the model, just as add() would.
  • add: Adds a single layer to the model.
  • compile: Takes the added layers and the parameters of compile to generate a trainable model.
  • fit: After compilation, fit() trains the model on its inputs and outputs.
  • predict: Makes predictions after fitting to the data. Takes a subset of the input data and returns predictions.

Layers

Layer

Abstract base class for the layers API. It shouldn't be instantiated directly in a model.

Input Layer

This should always be the first layer of the Sequential model. The other layers take an explicit output_size and infer their input_size from the previous layer's output_size, so the model's first input_size must be declared explicitly.
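To make the inference rule concrete, here is a minimal sketch of how a Sequential container might thread sizes through its layers. The names here are hypothetical; the package's internals may differ.

```python
# Hypothetical sketch: how input_size can be inferred from the previous
# layer's output_size, with Input declaring the very first size.
layers_spec = [("Input", 2), ("Dense", 3), ("Dense", 1)]

sizes = []
prev = None
for name, out in layers_spec:
    if name == "Input":
        prev = out               # Input declares the model's first input_size
        continue
    sizes.append((prev, out))    # (input_size, output_size) for this layer
    prev = out                   # next layer infers its input from this output

print(sizes)  # [(2, 3), (3, 1)]
```

This is why omitting Input(2) would leave the first Dense layer with no input_size to infer.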

Dense Layer

Just your regular densely connected NN layer. At the moment, the Dense layer performs only the dot product and bias addition, with no activation function; activations are delegated to the Activation layers.
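Conceptually, a Dense layer stores a weight matrix W and bias vector b, and its forward pass is the affine map y = W x + b. A minimal NumPy sketch (hypothetical class name; the package's internal implementation may differ):

```python
import numpy as np

class DenseSketch:
    """Minimal sketch of a fully connected layer: y = W @ x + b."""

    def __init__(self, input_size, output_size, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((output_size, input_size))
        self.b = np.zeros((output_size, 1))

    def forward(self, x):
        # x has shape (input_size, 1), matching the (4, 2, 1) XOR data above
        return self.W @ x + self.b

layer = DenseSketch(input_size=2, output_size=3)
y = layer.forward(np.array([[1.0], [0.0]]))
print(y.shape)  # (3, 1)
```

Note there is no nonlinearity here, which is exactly why an Activation layer must follow each Dense layer in the XOR example.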

Activation Layer

Applies an activation function to an output.

Tanh Layer

Hyperbolic tangent activation function.
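Tanh squashes each input elementwise into (-1, 1), and its derivative, 1 - tanh(x)^2, is what backpropagation uses during training. A quick NumPy illustration:

```python
import numpy as np

x = np.array([[-2.0], [0.0], [2.0]])

y = np.tanh(x)               # forward pass: elementwise hyperbolic tangent
grad = 1.0 - np.tanh(x) ** 2  # derivative used during backpropagation

print(y.ravel())   # values lie strictly in (-1, 1); tanh(0) == 0
```

Because its outputs saturate near -1 and 1, Tanh pairs naturally with the 0/1 targets in the XOR example above.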

Acknowledgements

Download files

Source Distribution: tensorfaux-0.0.1.tar.gz (154.6 kB)

Built Distribution: tensorfaux-0.0.1-py3-none-any.whl (6.9 kB)
