
A Keras callback package for iteratively selecting the most influential input nodes during training.

Project description

SpecNet

SpecNet is a Python package that provides feature selection algorithms embedded in a neural network architecture. It combines a leave-one-out cross-validation (LOOCV) type of feature selection algorithm with recursive pruning of the input nodes, such that only the most relevant nodes with the richest information are kept for the subsequent optimization task. The recursive pruning is carried out by a FeatureSelection callback invoked at certain points of the optimization process. The precise procedure is explained in Sequence of Events. Originally developed to find biomarkers in biological tissues, the algorithm is coded generically so that it can select features for all kinds of classification tasks.

The package is an extension of the Keras and TensorFlow libraries. Please follow the links for further information on these packages and to get a grasp of neural networks in general and of the constructs used in SpecNet.
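The core idea of the recursive pruning can be sketched in plain NumPy. The snippet below is only a conceptual illustration, not the package's implementation: the toy loss function and every name in it are made up for demonstration purposes.

import numpy as np

rng = np.random.default_rng(0)

def evaluate_loss(mask):
    """Stand-in for evaluating the trained network under a given input mask.
    The toy loss drops while the (made-up) informative features 0-4 stay active."""
    informative = np.zeros_like(mask)
    informative[:5] = 1.0
    return 1.0 - 0.1 * float(np.sum(mask * informative)) + 0.01 * rng.random()

n_features = 20
mask = np.ones(n_features)   # binary mask over the input nodes, all ones at the start
n_keep = 5                   # number of features to retain

while mask.sum() > n_keep:
    base = evaluate_loss(mask)
    # LOOCV-type importance: loss increase when a single active feature is masked out
    importance = np.full(n_features, np.inf)
    for i in np.flatnonzero(mask):
        trial = mask.copy()
        trial[i] = 0.0
        importance[i] = evaluate_loss(trial) - base
    # prune the active feature whose removal hurts the least
    mask[np.argmin(importance)] = 0.0

print("kept features:", np.flatnonzero(mask))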

Installation

At the moment, it is best to install this as an external package with pip. This can be done by cloning the repository and installing it in editable mode with the following commands:

git clone https://github.tik.uni-stuttgart.de/FelixFischer/FeaSel-Net.git feasel-net
cd feasel-net
pip install -e .

Sequence of Events

  1. Initializing the neural network: The first step of the algorithm can be thought of as a simple optimization problem initialized with the inputs and a binary mask over those inputs whose entries are all ones. This behaviour is induced by a newly created layer type called LinearPass.

    (Figure: Initialization)

  2. Training until the trigger conditions are met: The neural network optimizes the classification results until one of the following conditions is fulfilled:

    • the training (or validation) loss is below a certain threshold
    • the training (or validation) accuracy is above a certain threshold

    For the sake of consistency, the callback then counts how many times in a row the condition is met. If this happens for multiple epochs in a row, the actual pruning event starts, which consists of estimating the feature importance and eliminating uninformative features.
  3. Importance estimation: As soon as the callback is triggered, the evaluation of the feature importance starts and the least informative input nodes are pruned. A hypothetical usage sketch follows this list.

    (Figure: Evaluation)
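The following usage sketch shows where the LinearPass layer and the FeatureSelection callback would sit in a Keras model. Only the names FeatureSelection and LinearPass come from this page; the import paths and the constructor arguments are assumptions made for illustration, so please consult the repository for the actual API.

import numpy as np
import tensorflow as tf

# assumed import paths, not necessarily the real module layout
from feasel.layers import LinearPass
from feasel.callbacks import FeatureSelection

n_features, n_classes = 100, 3

inputs = tf.keras.Input(shape=(n_features,))
x = LinearPass()(inputs)                              # binary mask over the input nodes
x = tf.keras.layers.Dense(32, activation="relu")(x)
outputs = tf.keras.layers.Dense(n_classes, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# assumed trigger parameters: the metric to watch, its threshold and the number of
# consecutive epochs ("delta epochs") the condition has to hold before pruning starts
fs_callback = FeatureSelection(metric="accuracy", threshold=0.95, delta_epochs=5)

X = np.random.rand(256, n_features)
y = np.random.randint(0, n_classes, size=256)
model.fit(X, y, epochs=100, callbacks=[fs_callback])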

Release Information

0.0.1 - Initial Release

  • callback FeatureSelection
    • trigger parameters: delta epochs, thresholds, ...
    • different metrics for triggering
    • etc.
  • layer LinearPass

ToDos

So far, only architectures built from dense layers are supported. The plan is to also support convolutional layers.

[x] DenseLayer support

[x] accuracy- and loss-based evaluation

[ ] ConvLayer support

[ ] support for intermediate layers

[ ] paper on algorithm

Download files


Source Distribution

FeaSel-Net-0.0.4.tar.gz (87.1 kB)


File details

Details for the file FeaSel-Net-0.0.4.tar.gz.

File metadata

  • Download URL: FeaSel-Net-0.0.4.tar.gz
  • Upload date:
  • Size: 87.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.4

File hashes

Hashes for FeaSel-Net-0.0.4.tar.gz:

  • SHA256: 7b2709b3cafd422c4760d56d63b44fda5f33a10ff77a45b8fe1c908eeaf62ded
  • MD5: 0dcc10a7561f59a79aab14eeaecc392c
  • BLAKE2b-256: 09e0f79b0d67b8c90edf159ed68ed96d8c73ce82d0bbda4a0b48c94527880a7a

