
Feed Forward Neural Networks

Project description

Feed Forward Neural Networks using NumPy

This library is a modification of my previous one. Click Here to check my previous library.

Installation

$ [sudo] pip3 install nicenet

Development Installation

$ git clone https://github.com/Subhash3/Neural_Net_Using_NumPy.git

Usage

>>> from nicenet import NeuralNetwork

Creating a Neural Network

inputs = 2
outputs = 1
network = NeuralNetwork(inputs, outputs, cost="mse")

# Add 2 hidden layers with 16 neurons each and activation function 'tanh'
network.add_layer(16, activation_function="tanh") 
network.add_layer(16, activation_function="tanh")

# Finish the neural network by adding the output layer with sigmoid activation function.
network.compile(activation_function="sigmoid")

Building a dataset

The package contains a Dataset class to create a dataset.

>>> from nicenet import Dataset

Make sure you have the inputs and target values in separate files, in CSV format.

input_file = "inputs.csv"
target_file = "targets.csv"

# Create a dataset object with the same inputs and outputs defined for the network.
dataset_handler = Dataset(inputs, outputs)
dataset_handler.make_dataset(input_file, target_file)
data, size = dataset_handler.get_raw_data()
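
For example, for the two-input XOR problem used later on this page, the two files might look like the listings below. The exact layout expected by make_dataset isn't spelled out here, so this is only a sketch assuming one comma-separated sample per row.

inputs.csv:
0,0
0,1
1,0
1,1

targets.csv:
0
1
1
0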

If you want to manually make a dataset, follow these rules:

  • Dataset must be a list of data samples.
  • A data sample is a tuple containing inputs and target values.
  • Input and target values are column vectors of size (inputs x 1) and (outputs x 1), respectively.

For example, a typical XOR dataset looks something like this:

>>> import numpy as np
>>> XOR_data = [
    (
        np.array([[0], [0]]),
        np.array([[0]])
    ),
    (
        np.array([[0], [1]]),
        np.array([[1]])
    ),
    (
        np.array([[1], [0]]),
        np.array([[1]])
    ),
    (
        np.array([[1], [1]]),
        np.array([[0]])
    )
]
>>> size = 4
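
Each sample's shapes can be checked to confirm they follow the column-vector rule above:

>>> XOR_data[0][0].shape   # input column vector, (inputs x 1)
(2, 1)
>>> XOR_data[0][1].shape   # target column vector, (outputs x 1)
(1, 1)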

Training The network

The library provides a Train method which accepts the dataset and its size, along with optional parameters such as epochs and logging.

def Train(self, dataset: T_Dataset, size, epochs=100, logging=False, epoch_logging=True, prediction_evaulator=None):
	....
	....

For example, to train your network for 1000 epochs:

>>> network.Train(data, size, epochs=1000)

Notice that I left the logging parameters at their defaults, as I want the output to be printed for each epoch.
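
The signature above also accepts a prediction_evaulator callback which, per the to-do list below, compares predictions with the actual targets. Its exact contract isn't documented on this page; the sketch below assumes it receives the predicted and target column vectors and returns a boolean.

import numpy as np

# Hypothetical evaluator (assumed signature: (prediction, target) -> bool):
# count a prediction as correct when it rounds to the target vector.
def rounded_match(prediction, target):
    return bool(np.all(np.round(prediction) == target))

network.Train(data, size, epochs=1000, prediction_evaulator=rounded_match)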

Debugging

Plot a nice epoch vs error graph

>>> network.epoch_vs_error()

Know how well the model performed.

>>> network.evaluate()

To take a look at all the layers' info

>>> network.display()

Sometimes, the learning rate might have to be altered for better convergence.

>>> network.set_learning_rate(0.1)

Exporting Model

You can export a trained model to a JSON file, which can be loaded and used for predictions in the future.

filename = "model.json"
network.export_model(filename)

Load Model

To load a model from an exported model (JSON) file, use load_model. It is a static function, so you must not call it on a NeuralNetwork object; call it on the NeuralNetwork class itself.

filename = "model.json"
network = NeuralNetwork.load_model(filename)
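
Putting export and load together, you can train a model once, save it, and restore it later without retraining. The snippet below is a sketch that uses only the calls shown above.

# Train and save in one session ...
network.Train(data, size, epochs=1000)
network.export_model("model.json")

# ... and restore it later to inspect or reuse it.
restored = NeuralNetwork.load_model("model.json")
restored.display()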

Todo

- [x] Generalize the gradient descent algorithm
    - [x] Generalise the loss function => Write a separate class for it!
- [x] Implement Cross Entropy Loss
- [ ] Data scaling
    - [x] Min Max scaler
    - [ ] Data Standardization
- [x] Change the datasample type to a tuple instead of a list.
- [x] Show Progress bar if epoch_logging is False
- [x] Use a function as a parameter to Train method to compare predictions and actual targets.
- [x] convert all camel-cased vars to snake-case.

- [ ] API docs
    - [x] Add doc strings to all functions.
    - [x] Make the class/function declarations' docs collapsable.
    - [ ] Merge API md files and embed them in Readme.
    - [ ] Create a section, API, in README to provide documentation for all prototypes.

- [ ] Implement Batch Training
- [ ] Write a separate class for Scalers as the scaling methods increase.
- [ ] Linear and Relu activation functions
- [ ] Ability to perform regression
- [ ] Separate out the output layer from other layers. => Create a separate class for the output layer which inherits Layer.


- [ ] Convolution Nets
- [ ] Recurrent Nets

