Flexible_Neural_Net

A simple and flexible Python library that lets you build custom neural networks and easily tweak parameters to change how your network behaves.

Installation

pip install flexible-neural-network

Initialization

  • First, initialize a NeuralNet object and pass the number of inputs, outputs, and hidden layers:

    myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers)

  • You can choose which activation function to use from "relu", "sigmoid", and "tanh":

    myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, activation_func="sigmoid")

  • You can modify the learning rate:

    myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, learning_rate=0.1)

  • You can tweak the number of nodes in each hidden layer:

    • by assigning an integer such as 3: if there are 4 hidden layers, each layer will have 3 nodes => [3, 3, 3, 3]

      myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, nodes_in_each_layer=3)

    • by assigning a list of integers such as [3, 5, 2, 3] whose length equals number_of_hidden_layers: if there are 4 hidden layers, each layer will have the corresponding number of nodes => [3, 5, 2, 3] (a combined sketch of these options follows this list)

      myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, nodes_in_each_layer=[3, 5, 2, 3])
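
Putting these options together, a minimal initialization might look like the sketch below. The import path is an assumption based on the package name and may differ from the module the package actually installs.

# NOTE: import path assumed from the package name; adjust to the installed module name
from flexible_neural_network import NeuralNet

# 2 inputs, 1 output, 4 hidden layers with 3, 5, 2 and 3 nodes respectively,
# sigmoid activation and a learning rate of 0.1
myNN = NeuralNet(2, 1, 4,
                 nodes_in_each_layer=[3, 5, 2, 3],
                 learning_rate=0.1,
                 activation_func="sigmoid")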

How to use

Assuming you initialized your object and data as below:

import numpy as np
# from flexible_neural_network import NeuralNet  # import path assumed from the package name

myNN = NeuralNet(2, 1, 2, nodes_in_each_layer=4, learning_rate=0.1, activation_func="sigmoid")

data = np.array([
        [3,   1.5, 1],
        [2,   1,   0],
        [4,   1.5, 1],
        [3,   1,   0],
        [3.5, .5,  1],
        [2,   .5,  0],
        [5.5, 1,   1],
        [1,   1,   0]
        ])

mystery_data = [2, 1] # should be classified as 1

You can do the following (in the examples below, the number of epochs is set to 1):

  • Train a single entry:
    myNN.train(data[0, 0:2], data[0, 2], epochs=1)

  • Train multiple entries:
    myNN.train_many(data[:, 0:2], data[:, 2], epochs=1)

  • Test single/multiple entries:
    output = myNN.test(mystery_data)
    where output is always an np.ndarray whose size matches the number of outputs specified in the object's constructor. For the current example it is [1.45327823].

  • Save the NN for later:
    myNN.save("file_name")

  • Load the NN later without retraining:
    myNN = NeuralNet.load("file_name")
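
As a rough end-to-end sketch, the calls above can be chained using the myNN and data objects defined earlier. The epoch count below is chosen arbitrarily for illustration; the bullet examples use epochs=1.

# Assumes the myNN and data objects defined in the snippet above.
# Train on all rows at once; the epoch count here is arbitrary.
myNN.train_many(data[:, 0:2], data[:, 2], epochs=1000)

# Classify an unseen point; test() returns an np.ndarray whose size matches
# the number of outputs passed to the constructor.
output = myNN.test(mystery_data)
print(output)  # e.g. [1.45327823]

# Persist the trained network and restore it later without retraining.
myNN.save("file_name")
myNN = NeuralNet.load("file_name")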

Of course, neural networks do not give exact answers; it is our job to determine which class the output belongs to. Judging from the training data we only have classes 0 and 1, and the output we got is nearer to 1 than to 0, so we classify it as 1.
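
For a two-class problem like this one, that rounding step can be written as a tiny piece of our own post-processing (this threshold is not part of the library):

# Map the continuous output to the nearest class label (0 or 1).
predicted_class = 1 if float(output[0]) >= 0.5 else 0
print(predicted_class)  # 1 for an output such as [1.45327823]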

