A pure-Python neural network library
MyNN is a simple, NumPy-centric neural network library that builds on top of MyGrad. It provides:
- Neural network layers (e.g. convolutional, dense, batch normalization, dropout)
- Weight initialization functions (e.g. Glorot, He, uniform, normal)
- Neural network activation functions (e.g. elu, glu, tanh, sigmoid)
- Common loss functions (e.g. cross-entropy, KL-divergence, Huber loss)
- Optimization algorithms (e.g. sgd, adadelta, adam, rmsprop)
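To illustrate what a few of these pieces compute, here is a pure-NumPy sketch of He-normal weight initialization, a dense layer, and softmax cross-entropy. The function names here are illustrative only, not MyNN's actual API; in MyNN these components are backed by MyGrad tensors so that gradients are computed automatically.

```python
import numpy as np

rng = np.random.default_rng(0)

def he_normal(fan_in, fan_out):
    # He-normal initialization: zero-mean Gaussian with std = sqrt(2 / fan_in),
    # a common choice for layers followed by a ReLU
    return rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_in, fan_out))

def dense(x, w, b):
    # fully-connected (dense) layer: an affine map applied to the batch
    return x @ w + b

def relu(x):
    return np.maximum(x, 0.0)

def softmax_cross_entropy(scores, targets):
    # numerically stable log-softmax, then the mean negative log-likelihood
    shifted = scores - scores.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(targets)), targets].mean()

# forward pass of a toy two-layer classifier on a random batch
x = rng.normal(size=(4, 3))              # batch of 4 samples, 3 features each
w1, b1 = he_normal(3, 8), np.zeros(8)
w2, b2 = he_normal(8, 2), np.zeros(2)
scores = dense(relu(dense(x, w1, b1)), w2, b2)
loss = softmax_cross_entropy(scores, np.array([0, 1, 0, 1]))
```

Because MyNN's versions of these operations produce MyGrad tensors, calling `backward()` on the loss yields the gradients that an optimizer such as SGD or Adam then uses to update the weights.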
MyNN comes complete with several examples to ramp you up to being a fluent user of the library. It was written as an extension to MyGrad for rapid prototyping of neural networks with minimal dependencies, a clean codebase with excellent documentation, and as a learning tool.