
Simple neural network interface, including a pre-trained model for the Kaggle Titanic dataset

Project description

Titanicbc

Titanicbc is a simple interface for training neural networks with custom hyper-parameters. The current version allows training a binary classifier network for the famous Kaggle Titanic dataset.

The aim of this package is to allow those with little or no neural network coding experience to learn how different hyper-parameter combinations affect neural network training. The package also includes a pre-trained neural network for demonstrating how networks make predictions once trained.

Later versions will expand the package to contain more flexible interfaces and networks for other classic datasets, including image and text datasets with convolutional and recurrent neural networks.

Installation

You can install Titanicbc from PyPI:


pip install Titanicbc


How to use


Titanicbc provides a simple interface for training and using pre-trained networks via the config.yaml file.

The config.yaml file is installed in the Python site-packages folder for Titanicbc. To find the site-packages directory on your machine, run python -m site from the command line or terminal and follow the path given by USER-SITE.

Once hyper-parameters have been set in config.yaml, simply run python -m Titanicbc from the command line or terminal to train a network or make predictions (depending on the value of train_new in config.yaml). The predictions made by the new or pre-trained model will be saved to the same site-packages/Titanicbc folder as output.csv.
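As a sketch, a complete config.yaml using the options described below might look like the following (the values shown are illustrative examples, not tuned recommendations):

```yaml
train_new: True       # train a new model rather than load trained_model.pth
hidden_dim: 64        # neurons per hidden layer
num_epochs: 50        # passes over the training data
learning_rate: 0.001  # step-size factor for the Adam optimiser
weight_init: xavier   # weight initialisation scheme
```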


The options for config.yaml are presented below in the following format:

option number. Key (value options)

  1. train_new (True, False) - If True, a new neural network will be trained, overwriting trained_model.pth. If False, the model parameters saved in trained_model.pth will be loaded and used for predictions.

  2. hidden_dim (Integer) - Number of neurons in each of the 3 hidden layers within the network.

  3. num_epochs (Integer) - Number of passes the network will make over the training data when training a new model.

  4. learning_rate (float) - Scaling factor applied to the weight updates during gradient descent. Currently only the Adam optimiser is used.

  5. weight_init (uniform, xavier) - Tells the network which type of initialisation to use for the model weights. Xavier is currently recommended.
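To illustrate what the weight_init option controls, here is a minimal stdlib-only sketch of the two schemes. This is not Titanicbc's actual implementation (which presumably uses a deep learning framework's built-in initialisers); the layer sizes are assumed for the example:

```python
import math
import random

def uniform_init(fan_in, fan_out, scale=0.05):
    # Plain uniform initialisation: every weight drawn from a fixed
    # range [-scale, scale], regardless of layer size.
    return [[random.uniform(-scale, scale) for _ in range(fan_out)]
            for _ in range(fan_in)]

def xavier_uniform_init(fan_in, fan_out):
    # Xavier (Glorot) uniform initialisation: the bound shrinks as the
    # layer grows, keeping activation variance roughly constant across
    # layers, which tends to make deeper networks easier to train.
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    return [[random.uniform(-limit, limit) for _ in range(fan_out)]
            for _ in range(fan_in)]

# e.g. a first layer mapping 7 input features to hidden_dim = 32 neurons
weights = xavier_uniform_init(7, 32)
```

This is why Xavier is usually the safer default: the fixed-range uniform scheme needs its scale hand-tuned to the network's width, whereas the Xavier bound adapts automatically.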


