
Multilayer perceptron network implementation in Python

Project description


Perceptron implements a multilayer perceptron network written in Python. This type of network consists of multiple layers of neurons: the first takes the input and the last produces the output. There can be several hidden layers in between, but this implementation uses a single one.

For further information about multilayer perceptron networks, please read this entry on Wikipedia.

Requirements

Installation

You can install the package via easy_install or pip:

easy_install perceptron
pip install perceptron

Feeding Forward

The neural network uses the hyperbolic tangent (tanh) function.

Hyperbolic tangent

The x-axis is the total input to the node. Near an input of 0, the output climbs quickly. With an input of 2, the output is almost at 1 and doesn’t get much higher. The tanh is one of the sigmoid family of functions used to calculate the output of the neurons.
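A minimal sketch of the activation and its derivative helps make this concrete. The package’s internals are not shown here, so the function names below (`tanh`, `dtanh`) are assumptions based on the `dtanh` function mentioned in the training section; a common convention is to express the derivative in terms of the stored output rather than the input:

```python
import math

def tanh(x):
    # hyperbolic tangent activation: squashes the total input into (-1, 1)
    return math.tanh(x)

def dtanh(y):
    # derivative of tanh expressed in terms of the *output* y = tanh(x):
    # d/dx tanh(x) = 1 - tanh(x)^2, so only the stored output is needed
    return 1.0 - y * y
```

Expressing the derivative in terms of the output is convenient during backpropagation, since feedforward has already stored each node’s output.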

Note: Before running the feedforward algorithm, the network has to query the nodes and connections and build, in memory, the portion of the network that is relevant to a specific input.

Training with Backpropagation

The backpropagation algorithm then performs the following steps.

For each node in the output layer:

  1. Calculate the difference between the node’s current output and what it should be.

  2. Use the dtanh function to determine how much the node’s total input has to change.

  3. Change the strength of every incoming input in proportion to the input’s current strength and the learning rate.

For each node in the hidden layer:

  1. Change the output of the node by the sum of the strength of each output link multiplied by how much its target node has to change.

  2. Use the dtanh function to determine how much the node’s total input has to change.

  3. Change the strength of every incoming input in proportion to the input’s current strength and the learning rate.

The implementation of this algorithm actually calculates all the errors in advance and only then adjusts the weights, because all the calculations rely on knowing the current weights rather than the updated weights.
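The steps above can be sketched as follows. This is not the package’s actual code: the function name `backpropagate`, the parameter layout (weight matrices `wi`, `wo` and stored activations `ai`, `ah`, `ao`), and the learning rate are illustrative assumptions. Note how all deltas are computed from the current weights before any weight is touched:

```python
import math

def dtanh(y):
    # derivative of tanh in terms of the stored output y = tanh(x)
    return 1.0 - y * y

def backpropagate(ai, ah, ao, targets, wi, wo, lr=0.5):
    """One backpropagation pass over a 3-layer network.

    ai, ah, ao -- stored activations of the input, hidden and output nodes
    targets    -- desired output values
    wi, wo     -- input->hidden and hidden->output weight matrices
    """
    # output layer: difference from the target, scaled by how much the
    # node's total input has to change
    output_deltas = [dtanh(ao[k]) * (targets[k] - ao[k]) for k in range(len(ao))]

    # hidden layer: sum of downstream deltas weighted by link strength,
    # again scaled by dtanh of the stored output
    hidden_deltas = []
    for j in range(len(ah)):
        error = sum(output_deltas[k] * wo[j][k] for k in range(len(ao)))
        hidden_deltas.append(dtanh(ah[j]) * error)

    # only now adjust the weights -- every delta above was computed from
    # the current weights, not the updated ones
    for j in range(len(ah)):
        for k in range(len(ao)):
            wo[j][k] += lr * output_deltas[k] * ah[j]
    for i in range(len(ai)):
        for j in range(len(ah)):
            wi[i][j] += lr * hidden_deltas[j] * ai[i]
```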

Note: Before running the backpropagation method, it’s necessary to run feedforward so that the current output of every node is stored in the instance variables.

Usage

Import the module at the beginning of your file:

from perceptron import mlp

Init the neural network:

n = mlp.Net()

Example

In this example the neurons in the first layer respond to the ids that are used as input. If an id is present, the neurons that are strongly connected to it become active. The second layer is fed by the first layer, so it responds to combinations of ids. Finally, the neurons feed their results to the outputs, and particular combinations may be strongly or weakly associated with the possible results. In the end, the final decision is whichever output is strongest when classifying an id.

from perceptron import mlp

def main():
  n = mlp.Net()

  # train the network: input ids, candidate output ids, correct output id
  for i in range(30):
    n.train([101, 103], [201, 202, 203], 201)
    n.train([102, 103], [201, 202, 203], 202)
    n.train([101], [201, 202, 203], 203)

  # evaluate: returns one activation per candidate output id
  print n.eval([101, 103], [201, 202, 203])
  print n.eval([102, 103], [201, 202, 203])
  print n.eval([103], [201, 202, 203])

if __name__ == '__main__':
  main()

That will give the following output:

[0.8435967735300776, 0.011059223531796199, 0.017992770688108367]
[-0.028282207517584094, 0.8775955174169334, 0.0032322039490162353]
[0.8459277961565395, -0.011590385221469553, -0.8361964445052618]
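Assuming `eval` returns activations in the same order as the candidate output ids (consistent with the output above), the winning id can be picked with a simple argmax; the helper name `strongest` is illustrative, not part of the package:

```python
def strongest(ids, activations):
    # pair each candidate id with its activation and return the id
    # whose output node is strongest
    return max(zip(ids, activations), key=lambda pair: pair[1])[0]
```

For the first result above, `strongest([201, 202, 203], [0.84, 0.01, 0.02])` picks 201, matching the id the network was trained to associate with the input `[101, 103]`.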

Licence

Copyright © 2016 Roger Fernandez Guri. It is free software, and may be redistributed under the terms specified in the LICENCE file.
