
This is an ANN implementation using the TensorFlow package

Project description

ANN - Implementation using TensorFlow

INTRODUCTION

In this article, I will explain the basics of neural networks and their code. Nowadays, many students learn how to code neural networks without understanding the core concepts behind them or how they work internally. So first, let us understand what a neural network is.

What is a Neural Network?

A neural network is a series of algorithms that tries to mimic the human brain and find relationships within sets of data. It is used in many applications, such as regression, classification, image recognition, and more.

Since neural networks try to mimic the human brain, there are both similarities and differences between the two. Let us briefly look at the differences.

One major difference is that a biological neural network processes information in parallel, whereas an artificial neural network processes it serially. Another is speed: processing in the former is slower (on the order of milliseconds), while in the latter it is faster (on the order of nanoseconds).

Architecture of ANN

A neural network has many layers, and each layer performs a specific function. As the complexity of the model increases, the number of layers also increases, which is why such a network is known as a multi-layer perceptron.

The simplest form of a neural network has three layers: the input layer, the hidden layer, and the output layer. The input layer picks up the input signals and transfers them to the next layer, and the output layer gives the final prediction. Like other machine learning algorithms, these networks have to be trained on training data before they can be applied to a particular problem. Now, let's understand the perceptron in more detail.

About Perceptron

As discussed above, the layers of a multi-layer perceptron are basically the hidden (dense) layers. They are made up of many neurons, and neurons are the primary units that work together to form a perceptron. In simple words, each circle in the figure represents a neuron, and a vertical stack of neurons forms a perceptron, which is basically a dense layer.

[Figure: About Perceptron - detailed view of a single neuron in an ANN]

The figure above shows a detailed view of a single neuron. Each neuron has weights (w1, w2, w3 in the picture) and a bias, and the computation is done as combination = bias + weights * input, i.e. F = w1*x1 + w2*x2 + w3*x3 + bias. Finally, an activation function is applied: output = activation(combination). In the picture the activation is the sigmoid, 1 / (1 + e^(-F)). There are other activation functions as well, such as ReLU, Leaky ReLU, tanh, and many more.
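
To make this computation concrete, here is a minimal sketch of a single neuron in plain NumPy (the input, weight, and bias values are made-up examples, not taken from the package):

    import numpy as np

    def sigmoid(f):
        # sigmoid activation: 1 / (1 + e^(-F))
        return 1.0 / (1.0 + np.exp(-f))

    # illustrative example values (assumptions, not real data)
    x = np.array([0.5, 0.2, 0.1])   # inputs x1, x2, x3
    w = np.array([0.4, 0.7, 0.3])   # weights w1, w2, w3
    bias = 0.1

    combination = np.dot(w, x) + bias   # F = w1*x1 + w2*x2 + w3*x3 + bias
    output = sigmoid(combination)       # output = activation(combination)
    print(output)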

Working of ANN

First, information is fed into the input layer, which transfers it to the hidden layers. The interconnections between these layers assign weights to each input randomly at the start. A bias is then added to each input neuron, and the weighted sum (a combination of inputs, weights, and bias) is passed through the activation function. The activation function decides which nodes fire for feature extraction, and finally the output is calculated. This whole process is known as forward propagation. The model output is then compared with the original output, the error is computed, and the weights are updated in backward propagation to reduce the error. This process continues for a certain number of epochs (iterations). Finally, the model weights are updated and predictions can be made.
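
The same forward/backward cycle can be sketched explicitly in TensorFlow. This is only an illustration of the loop described above, with a tiny made-up model and batch of data, not the package's actual training code:

    import tensorflow as tf

    # a tiny model and one made-up batch, just to show the training loop
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(4, activation="relu", input_shape=(3,)),
        tf.keras.layers.Dense(2, activation="softmax"),
    ])
    loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()
    optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

    x = tf.random.normal((8, 3))                # 8 samples, 3 features
    y = tf.constant([0, 1, 0, 1, 1, 0, 1, 0])   # true labels

    for epoch in range(5):                      # epochs (iterations)
        with tf.GradientTape() as tape:
            predictions = model(x)              # forward propagation
            loss = loss_fn(y, predictions)      # compare with the original output
        grads = tape.gradient(loss, model.trainable_variables)
        # backward propagation: update the weights to reduce the error
        optimizer.apply_gradients(zip(grads, model.trainable_variables))
        print(f"epoch {epoch + 1}, loss = {loss.numpy():.4f}")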

How to use this Module

The code is written so that you don't have to modify it yourself; you just need to change the values in the configuration file (YAML).

A glimpse of what is present in the configuration file is shown below, followed by a sketch of how it can be loaded.

params:
  epochs: 5
  batch_size: 32
  no_classes: 10
  input_shape: [28,28]
  loss_function: sparse_categorical_crossentropy
  metrics: accuracy
  optimizer: SGD
  validation_datasize: 5000



artifacts:
  artifacts_dir: artifacts
  model_dir: model
  plots_dir: plots
  checkoint_dir: checkpoints
  model_name: model.h5
  plots_name: plot.png

logs:
  logs_dir: logs_dir
  general_logs: general_logs
  tensorboard_logs: tensorboard_logs
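
As a rough illustration, the configuration above could be read with PyYAML like this (the file name config.yaml and the variable names are assumptions; the package's own loading code may differ):

    import yaml  # PyYAML

    # path is an assumption; adjust to wherever the config file lives
    with open("config.yaml") as f:
        config = yaml.safe_load(f)

    params = config["params"]
    print(params["epochs"])                    # 5
    print(params["loss_function"])             # sparse_categorical_crossentropy
    print(config["artifacts"]["model_name"])   # model.h5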

A glimpse of the Layers

    LAYERS = [
        tf.keras.layers.Flatten(input_shape=[28, 28], name="inputlayer"),
        tf.keras.layers.Dense(300, activation="relu", name="hiddenlayer1"),
        tf.keras.layers.Dense(100, activation="relu", name="hiddenlayer2"),
        tf.keras.layers.Dense(OUTPUT_CLASSES, activation="softmax", name="outputlayer"),
    ]
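
Putting the layers and the configuration values together, a model could be compiled and trained roughly as follows. This is only a sketch: OUTPUT_CLASSES, the MNIST example data, and the variable names are assumptions, not the package's actual code.

    import tensorflow as tf

    OUTPUT_CLASSES = 10   # no_classes from the config

    LAYERS = [
        tf.keras.layers.Flatten(input_shape=[28, 28], name="inputlayer"),
        tf.keras.layers.Dense(300, activation="relu", name="hiddenlayer1"),
        tf.keras.layers.Dense(100, activation="relu", name="hiddenlayer2"),
        tf.keras.layers.Dense(OUTPUT_CLASSES, activation="softmax", name="outputlayer"),
    ]

    model = tf.keras.Sequential(LAYERS)
    # these values mirror the params section of the configuration file
    model.compile(loss="sparse_categorical_crossentropy",
                  optimizer="SGD",
                  metrics=["accuracy"])

    # MNIST is used here only as example data matching the 28x28 input_shape
    (x_train_full, y_train_full), _ = tf.keras.datasets.mnist.load_data()
    x_train_full = x_train_full / 255.0
    x_valid, y_valid = x_train_full[:5000], y_train_full[:5000]   # validation_datasize: 5000
    x_train, y_train = x_train_full[5000:], y_train_full[5000:]

    model.fit(x_train, y_train, epochs=5, batch_size=32,
              validation_data=(x_valid, y_valid))
    model.save("model.h5")   # model_name from the config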

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ANN---Implementation-kkkumar2-0.0.2.tar.gz (8.1 kB view details)

Uploaded Source

Built Distribution

ANN_Implementation_kkkumar2-0.0.2-py3-none-any.whl (9.1 kB view details)

Uploaded Python 3

File details

Details for the file ANN---Implementation-kkkumar2-0.0.2.tar.gz.

File metadata

  • Download URL: ANN---Implementation-kkkumar2-0.0.2.tar.gz
  • Upload date:
  • Size: 8.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for ANN---Implementation-kkkumar2-0.0.2.tar.gz
Algorithm Hash digest
SHA256 b9461555d9df639f32a1d70cbc05efae5e71ba9406fe08ce937b8e2a9e3d854a
MD5 bb88a4fb4dc722057e4e18ead8725805
BLAKE2b-256 b8ea95512580144d0363326a4df1bc78d145c3700be0554bda417924189d9090

See more details on using hashes here.
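
If you want to check a downloaded archive against the SHA256 value listed above, here is a minimal sketch using Python's hashlib (the local file path is an assumption):

    import hashlib

    # path to the downloaded archive; adjust as needed
    path = "ANN---Implementation-kkkumar2-0.0.2.tar.gz"
    expected = "b9461555d9df639f32a1d70cbc05efae5e71ba9406fe08ce937b8e2a9e3d854a"

    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    print("OK" if digest == expected else "hash mismatch")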

File details

Details for the file ANN_Implementation_kkkumar2-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: ANN_Implementation_kkkumar2-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 9.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for ANN_Implementation_kkkumar2-0.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 ae1f2ad0f55b3c75381800b340ccde0a350bb2d3fedcdde8ec362e3de257c303
MD5 1271ea50d50177cabe6b78073b87a56d
BLAKE2b-256 4575a2d2fd259b9ea26fff8b2cc6601cc766001e712e59ff658bfb6ff474d48b

See more details on using hashes here.
