# Flexible_Neural_Net
A simple and flexible Python library that lets you build custom neural networks and easily tweak parameters to change how your network behaves.
## Installation

```shell
pip install flexible-neural-network
```
## Initialization
- First, initialize a `NeuralNet` object and pass the number of inputs, outputs, and hidden layers:

  ```python
  myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers)
  ```
- You can choose which activation function to use from `"relu"`, `"sigmoid"`, and `"tanh"`:

  ```python
  myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, activation_func="sigmoid")
  ```
- You can modify the learning rate:

  ```python
  myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, learning_rate=0.1)
  ```
- You can tweak the number of nodes in each hidden layer:
  - By assigning a single integer such as `3`: if there are 4 hidden layers, each layer will have 3 nodes => `[3, 3, 3, 3]`

    ```python
    myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, nodes_in_each_layer=3)
    ```

  - By assigning a list of integers such as `[3, 5, 2, 3]` whose length equals `number_of_hidden_layers`: if there are 4 hidden layers, each layer will have the corresponding number of nodes => `[3, 5, 2, 3]`

    ```python
    myNN = NeuralNet(number_of_inputs, number_of_outputs, number_of_hidden_layers, nodes_in_each_layer=[3, 5, 2, 3])
    ```
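The three activation names accepted by `activation_func` refer to the standard functions of those names. For reference, here is a minimal NumPy sketch of each; these are illustrative textbook definitions, not the library's internal code:

```python
import numpy as np

# Standard activation functions corresponding to the names
# accepted by activation_func (illustrative definitions only).
def relu(x):
    return np.maximum(0, x)      # max(0, x), element-wise

def sigmoid(x):
    return 1 / (1 + np.exp(-x))  # squashes input to (0, 1)

def tanh(x):
    return np.tanh(x)            # squashes input to (-1, 1)
```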
## How to use

Assuming you initialized your object and data as below:

```python
myNN = NeuralNet(2, 1, 2, nodes_in_each_layer=4, learning_rate=0.1, activation_func="sigmoid")
data = np.array([
    [3, 1.5, 1],
    [2, 1, 0],
    [4, 1.5, 1],
    [3, 1, 0],
    [3.5, .5, 1],
    [2, .5, 0],
    [5.5, 1, 1],
    [1, 1, 0]
])
mystery_data = [2, 1]  # should be classified as 1
```
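The slices used in the training calls below split each row of `data` into its features and its label. As a quick plain-NumPy check of what those slices select (no library code involved):

```python
import numpy as np

data = np.array([
    [3, 1.5, 1],  # two feature columns followed by a class label
    [2, 1,   0],
])

features = data[:, 0:2]  # all rows, first two columns (the inputs)
labels = data[:, 2]      # all rows, last column (the class labels)

assert features.shape == (2, 2)
assert labels.tolist() == [1.0, 0.0]
```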
You can then do the following (in the examples below the number of epochs is set to 1):
- Train a single entry:

  ```python
  myNN.train(data[0, 0:2], data[0, 2], epochs=1)
  ```
- Train multiple entries:

  ```python
  myNN.train_many(data[:, 0:2], data[:, 2], epochs=1)
  ```
- Test single or multiple entries:

  ```python
  output = myNN.test(mystery_data)
  ```

  where `output` is always an `np.ndarray` whose size matches the number of outputs specified in the object's constructor. For the current example it is `[1.45327823]`.
- Save the NN for later:

  ```python
  myNN.save("file_name")
  ```
- Load an NN without the need for retraining:

  ```python
  myNN = NeuralNet.load("file_name")
  ```
Obviously, NNs do not give exact answers, and it is our job to determine which class an output belongs to. Judging from the training data, we only have classes 0 and 1; the output we got is nearer to 1 than to 0, so we should classify it as 1.
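That decision rule can be written down explicitly. A minimal sketch using the example output value above; the clip-and-round threshold is our own choice for this two-class setup, not part of the library:

```python
import numpy as np

output = np.array([1.45327823])  # example network output from above

# With only classes 0 and 1, assign whichever class the output is
# nearer to; clip first so values outside [0, 1] still map to a
# valid class.
predicted_class = int(round(float(np.clip(output[0], 0.0, 1.0))))
# predicted_class is 1
```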
## File details

Details for the file `flexible-neural-network-0.0.42.tar.gz`.

### File metadata
- Download URL: flexible-neural-network-0.0.42.tar.gz
- Upload date:
- Size: 5.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.35.0 CPython/3.7.4
### File hashes

Algorithm | Hash digest
---|---
SHA256 | 69745b34ba82e0fa85fde4710a616061ef898eb7ad863ee1e31b634ea11fe474
MD5 | 31bccff5b043742d8abd082e2201b971
BLAKE2b-256 | 0f292e25fe08981d21698b48f00030c3fc48412c146bf5ac654b8966279b3512
## File details

Details for the file `flexible_neural_network-0.0.42-py3-none-any.whl`.

### File metadata
- Download URL: flexible_neural_network-0.0.42-py3-none-any.whl
- Upload date:
- Size: 6.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.35.0 CPython/3.7.4
### File hashes

Algorithm | Hash digest
---|---
SHA256 | 77f950912b2cc9b67f9729663d074c18275a00db009b5a89108e79ffedbba260
MD5 | 13648811bd5af11ea692b8d77f44744a
BLAKE2b-256 | 4525bb6e43091f514fd3b36958c40ec4df42403c77f01ef4a5ce08b66e24f763