A Neural Network Module to create Custom Dense Neural Networks
Project description
Overview
Custom-Neural-Net-Creator is an easy-to-use tool that allows developers and machine learning enthusiasts to create and deploy their own custom deep neural networks for various purposes. This tool makes the power of deep learning more accessible than before.
Neural networks are advanced machine learning algorithms that can detect complex patterns in large amounts of data and make predictions based on the information they have been trained on. These systems are designed to mimic the structure of the human brain, with interconnected layers of “neurons” that transmit information to each other. An artificial neural network typically consists of an input layer, one or more hidden layers, and an output layer, each with a specific number of neurons connected to preceding and following layers. Deep neural networks are a subtype of artificial neural networks that can learn from large datasets and make predictions based on them.
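To make the layer idea concrete, here is a minimal plain-NumPy sketch of what a single dense layer followed by a ReLU activation computes (outputs = inputs @ weights + bias, then an elementwise nonlinearity). This is for illustration only and is independent of the library's internals; the weight values are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense_forward(inputs, weights, bias):
    # inputs: (batch, n_in), weights: (n_in, n_out), bias: (n_out,)
    return inputs @ weights + bias

def relu(x):
    # ReLU zeroes out negative values elementwise
    return np.maximum(0, x)

x = np.array([[0.0, 1.0]])    # one sample with two input features
w = rng.normal(size=(2, 10))  # 2 inputs -> 10 hidden neurons
b = np.zeros(10)

hidden = relu(dense_forward(x, w, b))
print(hidden.shape)  # (1, 10)
```

Stacking several such layers, each feeding its outputs into the next, is exactly what the quickstart below does with the library's `Dense` and `ActivationLayer` classes.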
For more information, visit the documentation.
Quickstart
import numpy as np
from custom_neural_net_creator.model import Model
from custom_neural_net_creator.dense import Dense
from custom_neural_net_creator.activation_layer import ActivationLayer
from custom_neural_net_creator.activation_functions import relu, relu_derivative, sigmoid, sigmoid_derivative, tanh, tanh_prime
from custom_neural_net_creator.loss_functions import mean_squared_error, mean_squared_error_derivative
# Input data for the XOR problem
x = [[0,0], [0,1], [1,0], [1,1]]
y = [[0], [1], [1], [0]]
model = Model()
model.add(Dense(2, 10)) # Input layer: takes two inputs, feeds 10 neurons
model.add(ActivationLayer(relu, relu_derivative)) # First hidden layer has 10 neurons and uses ReLU
model.add(Dense(10, 10))
model.add(ActivationLayer(relu, relu_derivative)) # Second hidden layer has 10 neurons and uses ReLU
model.add(Dense(10, 1))
model.add(ActivationLayer(sigmoid, sigmoid_derivative)) # Output layer is one neuron with sigmoid as activation
# Train on the training data
model.fit(x, y, mean_squared_error, mean_squared_error_derivative, epochs=1000, learning_rate=0.1, verbosity=3)
# Loss of Epoch #1000: 0.0002757698731393589
# Test the model
predictions = model.predict(x[0:3])
print("Predicted: ")
print(predictions) # Predicted: [array([[0.02610931]]), array([[0.98778214]]), array([[0.9873547]])]
print("Actual:")
print(y[0:3])
# Actual:
# [[0], [1], [1]]
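Since the sigmoid output layer produces probabilities between 0 and 1, predictions can be thresholded at 0.5 to recover hard class labels. A minimal sketch, assuming predictions come back as a list of 1x1 NumPy arrays as shown above (the specific values here are copied from the sample output, not returned by the library):

```python
import numpy as np

# Sample sigmoid outputs, mimicking the shapes printed above
predictions = [np.array([[0.0261]]), np.array([[0.9878]]), np.array([[0.9874]])]

# Round each probability at 0.5 to get a 0/1 class label
labels = [int(np.round(p.item())) for p in predictions]
print(labels)  # [0, 1, 1]
```

These labels match the expected XOR outputs for the first three inputs.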
Project details
Hashes for custom_neural_net_creator-2.0.tar.gz

Algorithm | Hash digest
---|---
SHA256 | 82124b9354519efe9766ec4a50beff3fbc38916666caf6811ec7679c9d5aa758
MD5 | f73e33f53c7431f56a48a9c7b11cd456
BLAKE2b-256 | 26fc649d3aff47c055a223e3531ea6d098718335c245a11b055562959312af3c
Hashes for custom_neural_net_creator-2.0-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 15a9b2afc5062825713af287d626f8721b5b3366580bf35dcc1a68a504b50b7f
MD5 | 20e53f1ed0b0abeedeb8f0089e0d005d
BLAKE2b-256 | f34092e24ba885d263d357a9adde41ffba7fb9500043f5576313dc0df535496c