
Simple neural network interface, including a pre-trained model, for the Kaggle Titanic dataset

Project description

Titanicbc

Titanicbc is a simple interface for training neural networks with custom hyper-parameters. The current version supports training a binary classification network for the famous Kaggle Titanic dataset.

The aim of this package is to allow those with little or no neural network coding experience to learn how different hyper-parameter combinations affect neural network training. The package also includes a pre-trained neural network for demonstrating how networks make predictions once trained.

Later versions will expand the package to contain more flexible interfaces and networks for other classic datasets, including image and text datasets with convolutional and recurrent neural networks.

Installation

You can install Titanicbc from PyPI:


pip install Titanicbc


How to use


Titanicbc provides a simple interface for training and using pre-trained networks via the config.yaml file.

The config.yaml file is included in the Python site-packages folder for Titanicbc. To find the site-packages directory on your machine, run python -m site. Once in site-packages, select the Titanicbc folder.
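
As a convenience, the package directory (and therefore the location of config.yaml) can also be printed with a few lines of plain Python. This is only an illustrative snippet; it assumes nothing beyond the package being importable as Titanicbc.

import os
import Titanicbc

# Directory of the installed package, e.g. .../site-packages/Titanicbc
package_dir = os.path.dirname(Titanicbc.__file__)

# Full path of the config file to edit before training
print(os.path.join(package_dir, 'config.yaml'))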

Once hyper-parameters have been set in config.yaml, simply run python -m Titanicbc from the command line or terminal to train a network or make predictions (depending on the value of train_new in config.yaml). Validation-set accuracy, useful for comparing models, is displayed below the final epoch and above the prediction output and dataframe.

The predictions made by the new or existing model will be saved to the same location in site-packages/Titanicbc as output.csv. The output columns follow the format required by Kaggle: the PassengerId and a prediction of whether that passenger survived.


The options for config.yaml are listed below in the format: Key (accepted values) - description. An example config.yaml follows the list.

  1. train_new (True, False) - If True, a new neural network will be trained and its weights will overwrite trained_model.pth. If False, the model parameters saved in trained_model.pth will be loaded and used for predictions.

  2. hidden_dim (Integer) - Number of neurons in each of the 3 hidden layers within the network.

  3. num_epochs (Integer) - Number of passes the network will make over the training data when training a new model.

  4. learning_rate (float) - Factor by which the weight updates are scaled during optimisation. Currently only the Adam optimiser is used.

  5. weight_init (uniform, xavier) - Specifies which initialisation scheme to use for the model weights. Xavier is currently recommended.
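
An illustrative config.yaml combining the options above is shown below. The key names follow the list above; the values are example settings only, not tuned recommendations.

train_new: True         # train a new model and overwrite trained_model.pth
hidden_dim: 64          # neurons in each hidden layer
num_epochs: 50          # passes over the training data
learning_rate: 0.001    # scaling factor for weight updates (Adam optimiser)
weight_init: xavier     # or uniform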




Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

Titanicbc-1.1.2.tar.gz (5.7 kB)

Uploaded Source

Built Distribution

Titanicbc-1.1.2-py3-none-any.whl (42.8 kB)

Uploaded Python 3

File details

Details for the file Titanicbc-1.1.2.tar.gz.

File metadata

  • Download URL: Titanicbc-1.1.2.tar.gz
  • Upload date:
  • Size: 5.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.7.7

File hashes

Hashes for Titanicbc-1.1.2.tar.gz
  • SHA256: 7a2ed1c63c364b99ae97a99a75237a0062a856f7543ecd7f4115864d4aa2118d
  • MD5: 57efd782ada4e9210e33e99d2eca1cb3
  • BLAKE2b-256: b22664aefbea808631f25eee853dde5e87c950c858804a251f8aeb79ae2cfcb7


File details

Details for the file Titanicbc-1.1.2-py3-none-any.whl.

File metadata

  • Download URL: Titanicbc-1.1.2-py3-none-any.whl
  • Upload date:
  • Size: 42.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.7.7

File hashes

Hashes for Titanicbc-1.1.2-py3-none-any.whl
  • SHA256: d0b1939e6dd0420c2e3998e44ad922573a908ede8f0bfedd167ecbfb49e0cb9e
  • MD5: da75e64c0d9573dddbb3874d5f8cd379
  • BLAKE2b-256: f88eefad4c6da41b960fcafe19bef8b227aca4454f0c544c63ea1ee7bad64fa5

