A Keras callback package for iteratively selecting the most influential input nodes during training.

Project description

SpecNet

SpecNet is a Python package that embeds feature selection algorithms directly in a neural network architecture. It combines a leave-one-out cross-validation (LOOCV) type of feature selection with recursive pruning of the input nodes, so that only the most relevant, information-rich nodes are kept for the subsequent optimization task. The recursive pruning is carried out by a FeatureSelection callback that is invoked at certain points of the optimization process; the precise procedure is explained in Sequence of Events. Originally developed for finding biomarkers in biological tissues, the algorithm is coded generically so that it can select features for any kind of classification task.

The package is an extension of the Keras and TensorFlow libraries. Please refer to their documentation for further information on these packages and for an introduction to neural networks in general and the constructs SpecNet builds on.
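
SpecNet hooks into the training loop through the standard Keras callback mechanism. For readers new to that construct, the following minimal sketch shows how a custom callback is attached to model.fit and receives the metrics after every epoch; it uses plain Keras only and is not code from the package.

# Plain Keras illustration of the callback construct SpecNet builds on;
# this is not code from the package itself.
import numpy as np
from tensorflow import keras

class EpochLogger(keras.callbacks.Callback):
    """Prints the monitored metrics at the end of every epoch."""
    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        print(f"epoch {epoch}: " +
              ", ".join(f"{k}={v:.4f}" for k, v in logs.items()))

# A tiny model on random data, just to show where a callback is attached.
X = np.random.rand(128, 20).astype("float32")
y = np.random.randint(0, 3, size=(128,))

model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(3, activation="softmax"),
])
model.compile("adam", "sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, verbose=0, callbacks=[EpochLogger()])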

Installation

At the moment, it is best to install this as an external package with pip. This can be done by cloning the repository and installing it in editable mode:

git clone https://github.tik.uni-stuttgart.de/FelixFischer/FeaSel-Net.git feasel-net
cd feasel-net
pip install -e .

Sequence of Events

  1. Initializing the neural network. The first step of the algorithm can be thought of as a simple optimization problem that is initialized with the inputs and a binary mask over those inputs whose entries are all ones. This behaviour is implemented by a newly created layer type called LinearPass.

    (Figure: Initialization)

  2. Training until the trigger conditions are met. The neural network optimizes the classification results until one of the following conditions is met:

    • the training (or validation) loss falls below a certain threshold
    • the training (or validation) accuracy rises above a certain threshold

    For the sake of consistency, the callback then counts how many epochs in a row the condition is met. If it holds for several consecutive epochs, the actual pruning event starts, which consists of estimating the feature importance and eliminating uninformative features.
  3. Importance estimation. As soon as the callback is triggered, the evaluation of the feature importance begins: the contribution of each input node is assessed and the least informative features are masked out before training continues on the reduced feature set (a simplified, runnable sketch of the whole procedure is given after this list).

    (Figure: Evaluation)
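
The listed steps can be condensed into a small, self-contained sketch. To be clear, this is an illustration rather than the package's implementation: the class names MaskedInput and RecursivePruning are made up, the trigger only watches the training accuracy, and the weight-magnitude importance estimate is a crude stand-in for the LOOCV-type evaluation that FeaSel-Net actually performs.

# Simplified, self-contained sketch of the mechanism described above. It is
# not the package's implementation: the class names are made up, the trigger
# only watches the training accuracy, and the importance estimate (weight
# magnitude of the first Dense layer) is a crude stand-in for the LOOCV-type
# evaluation used by FeaSel-Net.
import numpy as np
from tensorflow import keras

class MaskedInput(keras.layers.Layer):
    """Element-wise multiplication with a non-trainable binary mask that
    starts as all ones -- the role played by the LinearPass layer."""
    def build(self, input_shape):
        self.mask = self.add_weight(name="mask", shape=(input_shape[-1],),
                                    initializer="ones", trainable=False)

    def call(self, inputs):
        return inputs * self.mask

class RecursivePruning(keras.callbacks.Callback):
    """Zeroes out the least important active inputs once the accuracy has
    stayed above `threshold` for `patience` consecutive epochs."""
    def __init__(self, mask_layer, threshold=0.95, patience=3, prune_frac=0.2):
        super().__init__()
        self.mask_layer = mask_layer
        self.threshold = threshold
        self.patience = patience
        self.prune_frac = prune_frac
        self.streak = 0

    def on_epoch_end(self, epoch, logs=None):
        acc = (logs or {}).get("accuracy", 0.0)
        self.streak = self.streak + 1 if acc >= self.threshold else 0
        if self.streak < self.patience:
            return
        self.streak = 0
        mask = self.mask_layer.mask.numpy()
        active = np.flatnonzero(mask)
        if active.size <= 2:                      # keep a minimal feature set
            return
        # Importance proxy: L2 norm of each input's weights in the first
        # Dense layer (FeaSel-Net evaluates importance with a LOOCV scheme).
        dense = next(l for l in self.model.layers
                     if isinstance(l, keras.layers.Dense))
        importance = np.linalg.norm(dense.get_weights()[0], axis=1)
        n_prune = max(1, int(self.prune_frac * active.size))
        prune = active[np.argsort(importance[active])[:n_prune]]
        mask[prune] = 0.0
        self.mask_layer.mask.assign(mask)
        print(f"epoch {epoch}: pruned {n_prune} inputs, "
              f"{int(mask.sum())} remaining")

# Tiny demo on random data where only two features carry information.
X = np.random.rand(256, 30).astype("float32")
y = (X[:, 0] + X[:, 1] > 1.0).astype("int32")

mask_layer = MaskedInput()
model = keras.Sequential([
    keras.Input(shape=(30,)),
    mask_layer,
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(2, activation="softmax"),
])
model.compile("adam", "sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=30, verbose=0,
          callbacks=[RecursivePruning(mask_layer, threshold=0.9)])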

Release Information

0.0.1 - Initial Release

  • callback FeatureSelection
    • trigger parameters: delta epochs, thresholds, ...
    • different metrics for triggering
    • etc.
  • layer LinearPass
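
As a rough illustration of how the trigger options listed above could be parameterized, the snippet below sketches a configurable trigger. The argument names (monitor, threshold, patience) are hypothetical and do not reflect the actual FeatureSelection signature; in particular, "delta epochs" is read here as the number of consecutive epochs the condition has to hold.

# Hypothetical trigger parameterization -- the argument names below are
# illustrative and are not the actual FeatureSelection API.
from tensorflow import keras

class TriggerMonitor(keras.callbacks.Callback):
    """Fires once the monitored metric meets its threshold for `patience`
    consecutive epochs: loss-like metrics must fall below the threshold,
    accuracy-like metrics must rise above it."""
    def __init__(self, monitor="val_accuracy", threshold=0.95, patience=5):
        super().__init__()
        self.monitor = monitor
        self.threshold = threshold
        self.patience = patience
        self.streak = 0

    def on_epoch_end(self, epoch, logs=None):
        value = (logs or {}).get(self.monitor)
        if value is None:
            return
        met = (value <= self.threshold if "loss" in self.monitor
               else value >= self.threshold)
        self.streak = self.streak + 1 if met else 0
        if self.streak >= self.patience:
            self.streak = 0
            print(f"epoch {epoch}: trigger fired on {self.monitor}={value:.4f}")
            # ...the pruning step from the sketch above would run here...

# e.g. TriggerMonitor(monitor="val_loss", threshold=0.1, patience=5)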

ToDos

So far, only densely connected layer architectures are supported. The plan is to also support convolutional layers.

[x] DenseLayer support

[x] accuracy- and loss-based evaluation

[ ] ConvLayer support

[ ] support for intermediate layers

[ ] paper on algorithm

Download files

Source Distribution

FeaSel-Net-0.0.3.tar.gz (3.9 kB)

File details

Details for the file FeaSel-Net-0.0.3.tar.gz.

File metadata

  • Download URL: FeaSel-Net-0.0.3.tar.gz
  • Upload date:
  • Size: 3.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.4

File hashes

Hashes for FeaSel-Net-0.0.3.tar.gz:

  • SHA256: 1fa2def70683567aeb6b40f7c2ca88897c71935dcceeca38ac7ac4b65178f881
  • MD5: 0b25d64399ae4d0c6d64a323bddb7bda
  • BLAKE2b-256: ad4d22a9cc63ba2529e606c6f3221ec991cbeb82d9f1629726598b0348483247
