
# nn-wtf - Neural Networks Wrapper for TensorFlow

nn-wtf aims at providing a convenience wrapper to Google's
[TensorFlow](http://www.tensorflow.org/) machine learning library.
Its focus is on making neural networks easy to set up, train and use.

The library is in pre-alpha right now and does not do anything seriously useful
yet.

## Installation

nn-wtf runs under Python 3.4 and above.

### Dependencies

You need to install TensorFlow manually. The installation process depends on
your system. Install the version of TensorFlow built for Python 3.4.

See the
[TensorFlow installation instructions](https://www.tensorflow.org/versions/r0.8/get_started/os_setup.html#download-and-setup)
for details.

Example installation for Linux without GPU support:
```
$ pip install --upgrade https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.8.0rc0-cp34-cp34m-linux_x86_64.whl
```

### NN-WTF itself
Simple:
```
$ pip install nn_wtf
```

## Documentation

Sorry, the documentation is absolutely minimal at this point. More useful
documentation will be ready by the time this package reaches alpha status.

### List of useful classes and methods

* `NeuralNetworkGraph`: Base class for defining and training neural networks
  * `__init__(self, input_size, layer_sizes, output_size, learning_rate)`
  * `set_session(self, session=None)`
  * `train(self, data_sets, max_steps, precision, steps_between_checks, run_as_check, batch_size)`
  * `get_predictor().predict(input_data)`
* `MNISTGraph`: Subclass of `NeuralNetworkGraph` suitable for working on MNIST data
* `NeuralNetworkOptimizer`: Optimize the geometry of a neural network for fast training
  * `__init__(self, tested_network, input_size, output_size, training_precision, layer_sizes, learning_rate, verbose, batch_size)`
  * `brute_force_optimal_network_geometry(self, data_sets, max_steps)`
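To give an idea of what a brute-force geometry search does, here is a plain-Python sketch of the approach. This is an illustration only, not the library's implementation; `train_until_precision` (a callback returning the number of steps needed to reach the target precision, or `None` on failure) and the mock trainer are hypothetical stand-ins:

```python
from itertools import product

def brute_force_optimal_geometry(candidate_sizes, train_until_precision, max_steps):
    """Try every combination of layer sizes and return the geometry
    that reaches the target precision in the fewest training steps."""
    best_geometry, best_steps = None, max_steps + 1
    for geometry in product(candidate_sizes, repeat=3):  # e.g. three hidden layers
        steps_needed = train_until_precision(geometry, max_steps)
        if steps_needed is not None and steps_needed < best_steps:
            best_geometry, best_steps = geometry, steps_needed
    return best_geometry, best_steps

# Mock trainer: pretend bigger networks need more steps, and the very
# smallest networks never reach the target precision at all.
def mock_trainer(geometry, max_steps):
    total = sum(geometry)
    if total < 100:        # too small to reach the target precision
        return None
    return min(total, max_steps)

geometry, steps = brute_force_optimal_geometry((32, 64, 128), mock_trainer, 5000)
```

The real optimizer presumably trains an actual network per candidate, which is why brute force is only practical for small candidate sets.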

### Usage example

If you want to try it on MNIST data, try this sample program:

```python
from nn_wtf.mnist_data_sets import MNISTDataSets
from nn_wtf.mnist_graph import MNISTGraph

import tensorflow as tf

data_sets = MNISTDataSets('.')
graph = MNISTGraph(
    learning_rate=0.1, layer_sizes=(64, 64, 16), train_dir='.'
)
graph.train(data_sets, max_steps=5000, precision=0.95)

image_data = MNISTDataSets.read_one_image_from_url(
    'http://github.com/lene/nn-wtf/blob/master/nn_wtf/data/7_from_test_set.raw?raw=true'
)
prediction = graph.get_predictor().predict(image_data)
assert prediction == 7
```
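The `7_from_test_set.raw` file above is presumably a raw MNIST image: 28×28 pixels, one grayscale byte per pixel. If you want to prepare such data yourself, decoding it is straightforward; this stdlib-only sketch (the 28×28 one-byte-per-pixel layout is an assumption about the format, and the helper name is made up) shows the idea:

```python
MNIST_SIDE = 28  # MNIST digits are 28x28 pixels

def decode_raw_grayscale(data, side=MNIST_SIDE):
    """Turn raw one-byte-per-pixel grayscale data into a row-major
    matrix of floats scaled to [0, 1]."""
    expected = side * side
    if len(data) != expected:
        raise ValueError('expected %d bytes, got %d' % (expected, len(data)))
    return [[data[row * side + col] / 255.0 for col in range(side)]
            for row in range(side)]

# Synthetic example: an all-black image with one white pixel at row 0, column 1.
raw = bytes([0, 255] + [0] * (MNIST_SIDE * MNIST_SIDE - 2))
image = decode_raw_grayscale(raw)
```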

From there on, you are on your own for now. More functionality and documentation
to come.

