
Fully connected neural network with four layers

Project description

Fully connected four-layer neural network
Handles a wide range of problems, both classification and regression.
The following walkthrough explains how to use the library with the help of two example files.
The first file contains the learning process, in which the neural network finds its weights.
The second file demonstrates the network's ability to make predictions on new, unseen data that was not part of the training set.
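The library is published on PyPI and can typically be installed with pip install tupa123.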


#Manual = https://www.mediafire.com/file/vtzpb8ne1g92mgz/Manual_Tupa123.pdf

#Excel example data = https://www.mediafire.com/file/k8xkw4592tb9uab/ALETAS.xlsx


#-----Files without comments:---------------------------------------

#-----FILE FOR MACHINE LEARNING

import tupa123 as tu

X = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=1, columnquantity=5, linesquantity=300)
y = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=6, columnquantity=2, linesquantity=300)

model = tu.nnet4(norma=5, coef=0, nn1c=5, nn2c=7, nn3c=5, nn4c=2, rate=0.01, epochs=2000, fa2c=5, fa3c=5, fa4c=0)
model.Fit_ADAM(X, y)
model.Plotconv()

input('end')

#-----FILE FOR APPLICATION OF MACHINE LEARNING

import tupa123 as tu

model = tu.nnet4(norma=5, coef=0, normout=1, nn1c=5, nn2c=7, nn3c=5, nn4c=2, fa2c=5, fa3c=5, fa4c=0)
X_new = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=1, columnquantity=5, linesquantity=1000)
y_resposta = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=6, columnquantity=2, linesquantity=1000)
y_pred = model.Predict(X_new)

tu.Statistics(y_pred, y_resposta)
tu.PlotCorrelation(y_pred, y_resposta)
tu.PlotComparative(y_pred, y_resposta)
input('end')

#------Commented files:------------------------------------------

#-----MACHINE LEARNING

import tupa123 as tu
#import the library

X = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=1, columnquantity=5, linesquantity=300)
y = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=6, columnquantity=2, linesquantity=300)
#learning data
#The data can come from any source, but the ExcelMatrix function gives a practical way to interact with Excel (an illustrative alternative is sketched right after this block)
#ExcelMatrix = collects data from Excel; the spreadsheet needs to be in the same folder as the Python file
#'ALETAS.xlsm' = example name of the Excel file / 'Plan1' = example name of the sheet (tab) where the data are
#Lineini=2, Columini=1 = example initial row and column of the data
#columnquantity, linesquantity = number of columns and rows of learning data to read
#X = input data for the regression / y = target data to be predicted
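#Illustrative aside (not part of the example file): since the data can come from any source,
#a plain NumPy array should also work, assuming Fit and Predict accept array-like input:
import numpy as np
X_alt = np.array([[1.0, 2.0, 3.0, 4.0, 5.0]])   # one example row with the 5 input columns
y_alt = np.array([[0.1, 0.2]])                  # one example row with the 2 target columns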

model = tu.nnet4(norma=5, coef=0, normout=1, nn1c=5, nn2c=7, nn3c=5, nn4c=2, rate=0.01, epochs=2000, fa2c=5, fa3c=5, fa4c=0, cost=0, regu=0, namenet='')
#creates the Neural Network model

#norma = type of data normalization (default=2); option 5 is illustrated in a sketch below:
#=-1, standardization
#=0, do nothing (no normalization)
#=1, between 0 and 1
#=2, between -1 and 1
#=3, log(x+coef)
#=4, log(x+coef) between 0 and 1
#=5, log(x+coef) between -1 and 1
#=6, log(x+coef) and standardization
#coef = used to avoid zero in log normalizations, example 0.0012345 (default=0)
#normout = if 1, normalizes the output (default=1); if 0, does not
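#Illustrative sketch of what option norma=5 represents, i.e. log(x+coef) rescaled to [-1, 1]
#(an assumption based on the list above, not the library's internal code):
import numpy as np

def log_between_minus1_and_1(x, coef=0.0):
    z = np.log(np.asarray(x, dtype=float) + coef)   # log transform; coef avoids log(0)
    zmin, zmax = z.min(axis=0), z.max(axis=0)       # per-column minimum and maximum
    return 2.0 * (z - zmin) / (zmax - zmin) - 1.0   # rescale each column to [-1, 1]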

#nn1c=5, nn2c=7, nn3c=5, nn4c=2 = number of neurons from the first to the fourth layer (default=1,5,5,1)
#rate = learning rate (default=0.01)
#epochs = number of epochs (default=1000)
#fa2c=5, fa3c=5, fa4c=0 = second to fourth layer activation functions (default=5,5,0)
#for regression, a linear activation (=0) is recommended for the fourth layer
#cost = cost function (default=0): 0 = MSE, mean squared error, for regression and classification / 1 = BCE, binary cross-entropy, for classification
#regu = regularization (default=0); a usual value for regression is 0.01
#namenet = name of the folder where the weights are saved; by default they go in the same directory as the .py file; necessary when working with more than one neural network

#Activation functions (a few are sketched after this list):
#=0 linear
#=1 sigmoid
#=2 softplus
#=3 Gaussian
#=4 ReLU
#=5 tanh
#=6 LReLU
#=7 arctan
#=8 exp
#=9 sine
#=10 swish
#=11 SELU
#=12 log-sigmoid
#=13 x^2
#=14 x^3
#=15 Symmetric Rectified Linear
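#Illustrative definitions of a few of the listed activations, using their standard formulas
#(shown for reference only; the library's own implementations may differ):
import numpy as np

def act_tanh(x):             # code 5: hyperbolic tangent
    return np.tanh(x)

def act_lrelu(x, a=0.01):    # code 6: leaky ReLU (the negative-side slope a is an assumed value)
    return np.where(x > 0, x, a * x)

def act_swish(x):            # code 10: swish, x * sigmoid(x)
    return x / (1.0 + np.exp(-x))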

model.Fit_ADAM(X, y)
#machine learning
#model.Fit_ADAM(X, y) = single-batch interpolation of all the learning data, with the ADAM accelerator (the generic ADAM update rule is sketched below for reference)
#model.Fit_STOC(X, y) = case-by-case interpolation, stochastic gradient descent
#model.Fit_STOC_ADAM(X, y) = case-by-case interpolation, stochastic gradient descent with ADAM
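#For reference, the standard ADAM update rule that the "ADAM accelerator" refers to
#(a generic textbook sketch, not the library's own code):
import numpy as np

def adam_step(w, grad, m, v, t, rate=0.01, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1.0 - b1) * grad                 # first-moment (mean) estimate
    v = b2 * v + (1.0 - b2) * grad**2              # second-moment (uncentered variance) estimate
    m_hat = m / (1.0 - b1**t)                      # bias corrections for the early steps
    v_hat = v / (1.0 - b2**t)
    w = w - rate * m_hat / (np.sqrt(v_hat) + eps)  # parameter update
    return w, m, v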

model.Plotconv()
#Plot the convergence process

input('End')

#-----APPLICATION OF MACHINE LEARNING

import tupa123 as tu

model = tu.nnet4(norma=5, coef=0, nn1c=5, nn2c=7, nn3c=5, nn4c=2, fa2c=5, fa3c=5, fa4c=0)
#the application file must be in the same folder as the learning file,
#where .txt files with the neural network weights and settings were generated
#the neural network must be created with the same configuration that was used in the learning phase (a sketch for projects with more than one network follows)
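#When more than one neural network is used in the same project, the namenet parameter
#documented above keeps their weight files apart. A sketch, assuming the same constructor
#signature as in the examples ('net_a' and 'net_b' are hypothetical folder names):
model_a = tu.nnet4(nn1c=5, nn2c=7, nn3c=5, nn4c=2, fa2c=5, fa3c=5, fa4c=0, namenet='net_a')
model_b = tu.nnet4(nn1c=5, nn2c=7, nn3c=5, nn4c=2, fa2c=5, fa3c=5, fa4c=0, namenet='net_b')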

X_new = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=1, columnquantity=5, linesquantity=1000)
#new input data from which predictions will be made

y_resposta = tu.ExcelMatrix('ALETAS.xlsm', 'Plan1', Lineini=2, Columini=6, columnquantity=2, linesquantity=1000)
#known correct answers, used to compare and evaluate the neural network's performance

y_pred = model.Predict(X_new)
#prediction, neural network result

tu.Statistics(y_pred, y_resposta)
#Statistical evaluation of the results
#It computes some basic statistics: mean difference, standard deviation, and correlation coefficient between the predicted and target variables (a manual check is sketched below)
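#The same three quantities can be checked by hand; an illustrative sketch, assuming y_pred and
#y_resposta are 2-D array-like and that Statistics uses these usual definitions:
import numpy as np

yp = np.asarray(y_pred, dtype=float)
yr = np.asarray(y_resposta, dtype=float)
diff = yp - yr
print('mean difference    :', diff.mean(axis=0))
print('standard deviation :', diff.std(axis=0))
for k in range(yp.shape[1]):
    print('correlation, output', k, ':', np.corrcoef(yp[:, k], yr[:, k])[0, 1])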

tu.PlotCorrelation(y_pred, y_resposta)
#Calculated and target correlation plot

tu.PlotCorrelation2(y_pred, y_resposta)
#Calculated and target correlation plot with standard deviation lines

tu.PlotComparative(y_pred, y_resposta)
#Calculated and target comparative plot

tu.PlotComparative2(y_pred, y_resposta, window_size=1000)
#Error plot with moving average

tu.PlotComparative3(y_pred, y_resposta)
#Calculated and target comparative plot with standard deviation areas

tu.PlotComparative4(y_pred, y_resposta)
#Plot of 2-sigma standard deviation areas around the target

tu.PlotDispe(y_pred, y_resposta)
#Error dispersion

tu.PlotDispe2(y_pred, y_resposta)
#Error dispersion with error proportion

tu.PlotHisto(y_pred, y_resposta)
#Percentage error histogram
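#Illustrative computation of the percentage error that PlotHisto is described as plotting
#(the exact definition used inside the library is an assumption; zero targets would need special care):
import numpy as np
yp = np.asarray(y_pred, dtype=float)
yr = np.asarray(y_resposta, dtype=float)
perc_error = 100.0 * (yp - yr) / yr    # element-wise percentage error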

input('end')

