Deep neural networks without the learning cliff! A wrapper library compatible with scikit-learn.

This library implements multi-layer perceptrons as a wrapper for the powerful Lasagne library, compatible with scikit-learn for a more user-friendly and Pythonic interface.
NOTE: This project is possible thanks to the nucl.ai Conference on July 18-20. Join us in Vienna!
Thanks to the underlying Lasagne implementation, this library supports the following neural network features, exposed through an intuitive and well-documented API (a short example combining several of them follows the list):
- Activation Functions —
- Nonlinear: Sigmoid, Tanh, Rectifier.
- Output: Linear, Gaussian, Softmax.
- Layer Types — Convolution (greyscale and color, 2D), Dense (standard, 1D).
- Learning Rules — sgd, momentum, nesterov, adadelta, adagrad, rmsprop.
- Regularization — L1, L2 and dropout.
- Dataset Formats — numpy.ndarray and scipy.sparse, with iterators coming soon.
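As a rough sketch of how these features combine, the snippet below builds a small convolutional classifier. The Convolution layer type and the learning_rule and dropout_rate parameters follow the project's documentation, but verify them against the version you install:

    from sknn.mlp import Classifier, Convolution, Layer

    # One 2D convolution layer feeding a softmax output, trained with
    # Nesterov momentum and dropout regularization.
    nn = Classifier(
        layers=[
            Convolution("Rectifier", channels=8, kernel_shape=(3, 3)),
            Layer("Softmax")],
        learning_rule="nesterov",
        learning_rate=0.01,
        dropout_rate=0.25,
        n_iter=10)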
If a feature you need is missing, consider opening a GitHub Issue with a detailed explanation of the use case, and we'll see what we can do.
To download and set up the latest official release, you can install it from PyPI directly:
> pip install scikit-neuralnetwork
This will install a copy of Lasagne as a dependency too. We recommend you use a virtual environment for Python.
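For example, a minimal setup using Python's built-in venv module might look like this (the environment name env is arbitrary):

> python -m venv env
> source env/bin/activate
> pip install scikit-neuralnetwork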
Then you can run the tests using nosetests -v sknn; further samples and benchmarks are available in the examples/ folder.
The library supports both regressors (to estimate continuous outputs from inputs) and classifiers (to predict labels from features). This is the sklearn-compatible API, shown here for a classifier:
    from sknn.mlp import Classifier, Layer

    nn = Classifier(
        layers=[
            Layer("Rectifier", units=100),
            Layer("Softmax")],  # softmax output layer for classification
        learning_rate=0.02,
        n_iter=10)

    nn.fit(X_train, y_train)
    y_valid = nn.predict(X_valid)
    score = nn.score(X_test, y_test)
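A regressor follows the same pattern; this minimal sketch assumes continuous targets in y_train and swaps in a Linear output layer:

    from sknn.mlp import Regressor, Layer

    nn = Regressor(
        layers=[
            Layer("Rectifier", units=100),
            Layer("Linear")],  # linear output for continuous targets
        learning_rate=0.02,
        n_iter=10)

    nn.fit(X_train, y_train)
    y_pred = nn.predict(X_valid)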