
Deep Neural Network

Project description


Implementation of a deep neural network with numpy. dnnet can now also run on GPU through cupy.

dnnet provides a high-level API to define and run neural network models. You can turn the GPU on or off layer-wise: for example, compute a convolution layer on GPU while running an activation layer and a dropout layer on CPU.

Table of Contents

  • Brief tour of dnnet: small examples and supported methodologies
  • Installation
  • Example: run sample scripts
  • Use in your project

Brief tour of dnnet

Quick glance of usage

A user creates an instance of NeuralNetwork, adds layers one by one,
finalizes the model, sets an optimizer, fits the model, and saves it.

In the example below, some arguments are omitted for simplicity.

import numpy as np

from dnnet.neuralnet import NeuralNetwork
from dnnet.training.optimizer import AdaGrad
from dnnet.training.weight_initialization import DefaultInitialization, He
from dnnet.training.loss_function import MultinomialCrossEntropy
from dnnet.layers.activation import Activation, ActivationLayer
from dnnet.layers.affine import AffineLayer
from dnnet.layers.batch_norm import BatchNormLayer
from dnnet.layers.convolution import ConvolutionLayer
from dnnet.layers.dropout import DropoutLayer

# Load x, y here

model = NeuralNetwork(input_shape=(1, 28, 28), dtype=np.float32)

model.add(ConvolutionLayer(filter_shape=(32, 3, 3)))
# Add more layers here (batch norm, activation, dropout, affine, ...).

optimizer = AdaGrad(learning_rate=1e-3, weight_decay=1e-3)
learning_curve = model.fit(
    x=x, y=y, epochs=5, batch_size=100, optimizer=optimizer,
    loss_function=MultinomialCrossEntropy())

model.save(path='./data/output', name='my_cnn.dat')
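The `# Load x, y here` comment stands in for any data-loading code that yields inputs of shape `(N, 1, 28, 28)` and one-hot labels. A minimal sketch with synthetic data (the shapes follow from `input_shape=(1, 28, 28)`; a real run would load MNIST instead):

```python
import numpy as np

n_samples, n_classes = 100, 10

# Random images shaped like MNIST input: (batch, channel, height, width).
x = np.random.rand(n_samples, 1, 28, 28).astype(np.float32)

# Integer labels converted to one-hot rows, as expected by a
# MultinomialCrossEntropy-style loss.
labels = np.random.randint(0, n_classes, size=n_samples)
y = np.eye(n_classes, dtype=np.float32)[labels]
```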

A user can also load a model and predict outputs.

model.load(path='./data/output', name='my_cnn.dat')
y = model.predict(x)
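For classification, the predicted class of each sample is the argmax of its output row. Accuracy against one-hot targets can then be computed with plain numpy (a sketch independent of dnnet; `y_pred` and `y_true` here are assumed `(N, n_classes)` arrays, not names from the library):

```python
import numpy as np

def accuracy(y_pred, y_true):
    """Fraction of rows where the argmax of prediction and target agree."""
    return float(np.mean(y_pred.argmax(axis=1) == y_true.argmax(axis=1)))

y_true = np.eye(3)[[0, 1, 2, 1]]           # one-hot targets
y_pred = np.array([[0.9, 0.05, 0.05],      # correct
                   [0.1, 0.8, 0.1],        # correct
                   [0.3, 0.4, 0.3],        # wrong: argmax is 1, target is 2
                   [0.2, 0.7, 0.1]])       # correct
print(accuracy(y_pred, y_true))  # → 0.75
```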

GPU support is easily enabled. Add the following at the top of your script.

from dnnet.config import Config
Config.enable_gpu()

If the GPU is enabled but you'd like to turn it off for some specific layers, use the force_cpu flag. Note that ConvolutionLayer and AffineLayer don't have this flag.

from dnnet.config import Config
Config.enable_gpu()

# Do something here.

# AffineLayer uses GPU.
model.add(AffineLayer(output=512, weight_initialization=He()))
# BatchNormLayer uses CPU regardless of Config.enable_gpu().
model.add(BatchNormLayer(force_cpu=True))

Supported Methods

Layers

  • Affine
  • Convolution
  • Activation
  • Pool
  • Batch Normalization
  • Dropout

Activation Functions

  • Sigmoid
  • ReLU
  • ELU
  • Tanh
  • Softmax
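The listed activations are element-wise, except softmax, which normalizes across a row. As plain numpy, a few of them look like this (reference formulas, not dnnet's internal implementation):

```python
import numpy as np

def relu(x):
    # Zero for negative inputs, identity otherwise.
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)
```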

Optimization Methods

  • SGD
  • Momentum
  • AdaGrad
  • Adam
  • AdaDelta
  • RMSProp
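All of these optimizers follow the same pattern: update parameters from the gradient, differing in how the step size is adapted. AdaGrad, for instance, scales each coordinate by its accumulated squared gradients. A sketch of the textbook update rule (not dnnet's code):

```python
import numpy as np

def adagrad_step(w, grad, cache, learning_rate=1e-3, eps=1e-8):
    """One AdaGrad update; `cache` accumulates squared gradients in place."""
    cache += grad ** 2
    w -= learning_rate * grad / (np.sqrt(cache) + eps)
    return w, cache

w = np.array([1.0, -1.0])
cache = np.zeros_like(w)
grad = np.array([0.5, -0.5])
w, cache = adagrad_step(w, grad, cache)
```

Coordinates that receive large gradients early see their effective learning rate shrink, which is why AdaGrad needs no per-layer tuning but can stall on long runs (AdaDelta and RMSProp address this by decaying the cache).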

Weight Initialization Methods

  • Xavier's method
  • He's method
  • Default
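Xavier's and He's methods both draw weights from a zero-mean distribution whose scale depends on the layer's fan-in (and, for Xavier, fan-out); He's variant suits ReLU-like activations. A numpy sketch of the usual formulas (not dnnet's exact implementation):

```python
import numpy as np

def he_init(fan_in, fan_out):
    # He et al.: std = sqrt(2 / fan_in), suited to ReLU activations.
    return np.random.randn(fan_in, fan_out) * np.sqrt(2.0 / fan_in)

def xavier_init(fan_in, fan_out):
    # Glorot & Bengio: std = sqrt(2 / (fan_in + fan_out)).
    return np.random.randn(fan_in, fan_out) * np.sqrt(2.0 / (fan_in + fan_out))

w = he_init(784, 512)
```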

Loss Functions

  • MultinomialCrossEntropy for multinomial classification.
  • BinomialCrossEntropy for binary classification.
  • SquaredError for regression.
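The three losses in numpy form (reference formulas only; the cross-entropy variants expect `y_pred` rows to be probabilities):

```python
import numpy as np

def multinomial_cross_entropy(y_pred, y_true, eps=1e-12):
    # Mean negative log-likelihood of the true class per sample.
    return float(-np.sum(y_true * np.log(y_pred + eps)) / len(y_true))

def binomial_cross_entropy(y_pred, y_true, eps=1e-12):
    return float(-np.mean(y_true * np.log(y_pred + eps)
                          + (1 - y_true) * np.log(1 - y_pred + eps)))

def squared_error(y_pred, y_true):
    return float(np.mean((y_pred - y_true) ** 2))
```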



Installation

Requirements

  • python 3.4 or later
  • numpy 1.12.0 or later
  • matplotlib

If you'd like to use the GPU, you additionally need to install the following.

  • CUDA (eg. CUDA 10.0)
  • CuDNN (eg. CuDNN7.6.5)
  • cupy (eg. cupy-cuda100==7.0.0)

Install dnnet by pip.

pip install dnnet

Install dnnet from source.

dnnet doesn't require any complicated path settings.
Just download the scripts from github, place them wherever you like,
and add a few lines like the ones below to your scripts.

import sys
sys.path.append('<path-to-dnnet>')

from dnnet.neuralnet import NeuralNetwork

Setup environment from scratch (Optional)

This section describes how to set up a python environment from scratch.
"From scratch" means you're assumed to be on a brand-new computer,
with no python packages (not even python itself!) or related libraries installed.

It may also be useful when you start a new python project; in that case,
execute only the relevant subset of the following steps.

Setup Python Virtual Environment


  • Use python3
  • Make a directory for pyenv in "/home/<user-name>/Documents"
  • The root directory of your python virtual env is "/home/<user-name>/Work/py352_ws"
  • "/home/<user-name>/Work/py352_ws/" is your working directory

Setup procedure

  • Install required packages
$ sudo apt-get install git gcc make openssl libssl-dev libbz2-dev libreadline-dev libsqlite3-dev
  • Install tkinter (required to use matplotlib in a virtualenv)
$ sudo apt-get install python3-tk python-tk tk-dev
  • Install pyenv
   $ cd ~/Documents
   $ git clone git://github.com/pyenv/pyenv.git ./pyenv
   $ mkdir -p ./pyenv/versions ./pyenv/shims
  • Set paths: add the following to ~/.bashrc
export PYENV_ROOT=${HOME}/Documents/pyenv
if [ -d "${PYENV_ROOT}" ]; then
  export PATH=${PYENV_ROOT}/bin:$PATH
  eval "$(pyenv init -)"
fi

Then execute the following.

   $ exec $SHELL -l
   $ . ~/.bashrc
  • Install pyenv-virtualenv
   $ cd $PYENV_ROOT/plugins
   $ git clone git://github.com/pyenv/pyenv-virtualenv.git
  • Install python 3.5.2
   $ pyenv install 3.5.2
  • Setup local pyenv
   $ mkdir -p ~/Work/py352_ws
   $ pyenv virtualenv 3.5.2 <name of this environment>

<name of this environment> can be anything you like, e.g. py352_env or python3_env.
Here, it's assumed that you named the environment "py352_env".

   $ cd ~/Work/py352_ws
   $ pyenv local py352_env
   $ pip install --upgrade pip



Example

  • Run the neural network example for mnist.
cd <path-to-dnnet>/examples/mnist

If you get the error "ImportError: Python is not installed as a framework.", it might be due to a matplotlib issue. (This happened to me when working on macOS.)

In that case, please try the following.

cd ~/.matplotlib
echo "backend: TkAgg" >> matplotlibrc

Usage in your project

If you pip installed dnnet

from dnnet.neuralnet import NeuralNetwork

If you git cloned dnnet

import sys
sys.path.append('<path-to-dnnet>')

from dnnet.neuralnet import NeuralNetwork

For example, if the dnnet directory is ~/Work/dnnet, do as below.

import os
import sys
sys.path.append(os.path.join(os.getenv('HOME'), 'Work/dnnet'))

from dnnet.neuralnet import NeuralNetwork

Download files

Files for dnnet, version 0.10.1:

  • dnnet-0.10.1-py2.py3-none-any.whl (34.1 kB, Wheel, py2.py3)
  • dnnet-0.10.1.tar.gz (25.6 kB, Source)
