A pure-Python neural network library

Project description

MyNN is a simple, NumPy-centric neural network library that builds on top of MyGrad. It provides convenient wrappers for common neural-network functionality, including:

  • Neural network layers (e.g. convolutional, dense, batch normalization, dropout)
  • Weight initialization functions (e.g. Glorot, He, uniform, normal)
  • Neural network activation functions (e.g. elu, glu, tanh, sigmoid)
  • Common loss functions (e.g. cross-entropy, KL-divergence, Huber loss)
  • Optimization algorithms (e.g. sgd, adadelta, adam, rmsprop)

MyNN ships with several examples to help you become a fluent user of the library. It was written as an extension to MyGrad to enable rapid prototyping of neural networks with minimal dependencies and a clean, well-documented codebase, and to serve as a learning tool.
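
To make concrete what those wrappers package up, below is a minimal, pure-NumPy sketch of the same ingredients: a Glorot-uniform initializer, dense layers, a relu activation, a softmax cross-entropy loss, and a plain SGD update. It is illustrative only and is not MyNN's API; every name in it (glorot_uniform, softmax_cross_entropy, and so on) is a local helper defined in the snippet, and the hand-written backward pass is exactly the bookkeeping that MyGrad's autodiff and MyNN's layers and optimizers handle for you.

    import numpy as np

    rng = np.random.default_rng(0)

    def glorot_uniform(fan_in, fan_out):
        """Glorot/Xavier uniform initialization: U(-limit, limit)."""
        limit = np.sqrt(6.0 / (fan_in + fan_out))
        return rng.uniform(-limit, limit, size=(fan_in, fan_out))

    def relu(x):
        return np.maximum(0.0, x)

    def softmax_cross_entropy(logits, labels):
        """Mean cross-entropy between softmax(logits) and integer labels,
        plus the gradient of that loss with respect to the logits."""
        shifted = logits - logits.max(axis=1, keepdims=True)  # numerical stability
        probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
        n = len(labels)
        loss = -np.log(probs[np.arange(n), labels]).mean()
        dlogits = probs.copy()
        dlogits[np.arange(n), labels] -= 1.0
        return loss, dlogits / n

    # Toy 3-class problem: 64 samples with 10 features each
    x = rng.normal(size=(64, 10))
    y = rng.integers(0, 3, size=64)

    # Two dense layers, Glorot-initialized
    W1, b1 = glorot_uniform(10, 16), np.zeros(16)
    W2, b2 = glorot_uniform(16, 3), np.zeros(3)

    lr = 0.1
    for step in range(100):
        # Forward pass: dense -> relu -> dense -> softmax cross-entropy
        h_pre = x @ W1 + b1
        h = relu(h_pre)
        logits = h @ W2 + b2
        loss, dlogits = softmax_cross_entropy(logits, y)

        # Backward pass, written out by hand (MyGrad automates this)
        dW2 = h.T @ dlogits
        db2 = dlogits.sum(axis=0)
        dh = dlogits @ W2.T
        dh_pre = dh * (h_pre > 0)        # relu gradient
        dW1 = x.T @ dh_pre
        db1 = dh_pre.sum(axis=0)

        # Plain SGD update (MyNN's optimizers package this step)
        for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
            param -= lr * grad

    print(f"final loss: {loss:.3f}")

With MyNN and MyGrad, the layer definitions, parameter initialization, and the hand-written backward pass above are replaced by reusable layer and optimizer objects together with MyGrad's automatic differentiation.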

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • mynn-0.9.4.tar.gz (31.6 kB)

Built Distributions

  • mynn-0.9.4-py3.9.egg (53.4 kB)

  • mynn-0.9.4-py3-none-any.whl (24.1 kB)
