
This is a single-layer perceptron package.

Project description

SINGLE LAYER PERCEPTRON:

The perceptron is the basic processing unit of a neural network. First proposed by Frank Rosenblatt in 1958, it is a simple model of a neuron that classifies its input into one of two categories. The perceptron is a linear classifier and is used in supervised learning to separate the given input data.

A perceptron is a neural network unit that performs a simple computation to detect features in the input data. Because it is mainly used to split the data into two classes, it is also known as a linear binary classifier.

The perceptron consists of four parts:

  • Input values or one input layer: The input layer of the perceptron is made of artificial input neurons and feeds the initial data into the system for further processing.
  • Weights and bias: A weight represents the strength of the connection between two units. If the weight from node 1 to node 2 is larger, then neuron 1 has a greater influence on neuron 2. The bias plays the same role as the intercept in a linear equation; it is an additional parameter whose task is to shift the output obtained from the weighted sum of the inputs.
  • Net sum: The weighted sum of all inputs plus the bias.
  • Activation function: Whether a neuron is activated or not is determined by the activation function, which is applied to the net sum to produce the output. A short illustrative sketch follows this list.
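
For intuition, a single forward pass over these four parts can be written in a few lines of NumPy. This is only an illustrative sketch (the values of x, w and b are made up for the example), not code taken from the package:

import numpy as np

x = np.array([1.0, 0.0])           # input values
w = np.array([0.4, -0.2])          # weights
b = 0.3                            # bias

net_sum = np.dot(w, x) + b         # net sum: weighted sum of the inputs plus the bias
output = 1 if net_sum > 0 else 0   # step activation function
print(output)                      # -> 1

In the package code shown further below, the bias is instead folded into the weight vector by appending a constant -1 input to every sample.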

How to use this package:

Import the required class and utility functions:

from oneNeuron.perceptron import Perceptron
from oneNeuron.allutils import prepare_data
from oneNeuron.allutils import save_model

How to call the Perceptron class:

model = Perceptron(eta, epochs)
model.fit(X, y)
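
A minimal end-to-end run might look like the sketch below. The AND-gate data and the hyperparameter values (eta=0.3, epochs=10) are made up for illustration, and the exact signatures of prepare_data and save_model are assumed from their names, so check the package source before relying on them:

import pandas as pd
from oneNeuron.perceptron import Perceptron
from oneNeuron.allutils import prepare_data
from oneNeuron.allutils import save_model

# toy AND-gate dataset (illustrative only)
df = pd.DataFrame({"x1": [0, 0, 1, 1], "x2": [0, 1, 0, 1], "y": [0, 0, 0, 1]})

X, y = prepare_data(df)                  # assumed to split the DataFrame into features and labels
model = Perceptron(eta=0.3, epochs=10)   # eta is the learning rate, epochs the number of passes
model.fit(X, y)
save_model(model, filename="and_model")  # assumed signature; persists the trained model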

A glance at the code:

import logging

import numpy as np
from tqdm import tqdm

class Perceptron:
  def __init__(self, eta, epochs):
    self.weights = np.random.randn(3) * 1e-4 # SMALL RANDOM WEIGHT INIT
    logging.info(f"initial weights before training: \n{self.weights}")
    self.eta = eta # LEARNING RATE
    self.epochs = epochs # NUMBER OF TRAINING EPOCHS


  def activationFunction(self, inputs, weights):
    z = np.dot(inputs, weights) # z = X . w
    return np.where(z > 0, 1, 0) # STEP FUNCTION: 1 WHERE z > 0, ELSE 0

  def fit(self, X, y):
    self.X = X
    self.y = y

    X_with_bias = np.c_[self.X, -np.ones((len(self.X), 1))] # CONCATENATE A CONSTANT -1 BIAS COLUMN
    logging.info(f"X with bias: \n{X_with_bias}")

    for epoch in tqdm(range(self.epochs), total=self.epochs, desc="Training the Model"):
      logging.info("--"*10)
      logging.info(f"for epoch: {epoch}")
      logging.info("--"*10)

      y_hat = self.activationFunction(X_with_bias, self.weights) # forward propagation
      logging.info(f"predicted value after forward pass: \n{y_hat}")
      self.error = self.y - y_hat
      logging.info(f"error: \n{self.error}")
      self.weights = self.weights + self.eta * np.dot(X_with_bias.T, self.error) # WEIGHT UPDATE (PERCEPTRON LEARNING RULE)
      logging.info(f"updated weights after epoch:\n{epoch}/{self.epochs} : \n{self.weights}")
      logging.info("#####"*10)


  def predict(self, X):
    X_with_bias = np.c_[X, -np.ones((len(X), 1))]
    return self.activationFunction(X_with_bias, self.weights)

  def total_loss(self):
    total_loss = np.sum(self.error) # SUM OF SIGNED ERRORS FROM THE LAST TRAINING EPOCH
    logging.info(f"total loss: {total_loss}")
    return total_loss
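
As a quick sanity check, the trained model can be evaluated with predict and total_loss. The OR-gate data and hyperparameters below are illustrative assumptions, not output reproduced from the package:

import numpy as np
from oneNeuron.perceptron import Perceptron

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])  # two input features per sample
y = np.array([0, 1, 1, 1])                      # OR-gate labels

model = Perceptron(eta=0.3, epochs=10)
model.fit(X, y)

print(model.predict(X))    # should converge to [0 1 1 1] for this linearly separable data
print(model.total_loss())  # 0 once every training sample is classified correctly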

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

oneNeuron_pypi-kkkumar2-0.0.5.tar.gz (4.2 kB)

Uploaded Source

Built Distribution

oneNeuron_pypi_kkkumar2-0.0.5-py3-none-any.whl (5.3 kB)

Uploaded Python 3

File details

Details for the file oneNeuron_pypi-kkkumar2-0.0.5.tar.gz.

File metadata

  • Download URL: oneNeuron_pypi-kkkumar2-0.0.5.tar.gz
  • Upload date:
  • Size: 4.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for oneNeuron_pypi-kkkumar2-0.0.5.tar.gz
Algorithm Hash digest
SHA256 9738382506fc03e8b7fa1661054255472bda38bb9f1476add7ca142de638c552
MD5 07cb515f9bd6d59810a395b41f4f1f82
BLAKE2b-256 6895ae26eb6cb374d4c3385af2ac251c75777ab85815205aec2df4abb7ec9a41

See more details on using hashes here.

File details

Details for the file oneNeuron_pypi_kkkumar2-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: oneNeuron_pypi_kkkumar2-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 5.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.8.1 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.3 CPython/3.9.7

File hashes

Hashes for oneNeuron_pypi_kkkumar2-0.0.5-py3-none-any.whl
Algorithm Hash digest
SHA256 adff6c55b4ef132d82827c6bc01062b029b72d476429554c80cf3775d633ad42
MD5 7b4848672a0bdac0447cc9c03cb099f7
BLAKE2b-256 83baa8989d82e83f124a6bbd8774c10b6805f2f93fd3d3035c60898062a3990c

See more details on using hashes here.
