Project description

mbgdregressor

This is a small library that runs mini-batch gradient descent on your linear regression dataset, something scikit-learn does not provide out of the box.
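
For context, the technique itself is simple: instead of updating the weights on one sample at a time (stochastic gradient descent) or on the whole training set at once (batch gradient descent), the weights are updated on small random batches. The plain-numpy sketch below illustrates the idea; it is only an illustration of the technique, not the package's actual implementation, and the function name and signature are made up for the example.

import numpy as np

def minibatch_gd(X, y, batch_size=32, learning_rate=0.01, epochs=100, seed=0):
    # Illustrative mini-batch gradient descent for linear regression (MSE loss).
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    w, b = np.zeros(n_features), 0.0
    losses = []
    for _ in range(epochs):
        order = rng.permutation(n_samples)            # reshuffle the data each epoch
        for start in range(0, n_samples, batch_size):
            idx = order[start:start + batch_size]     # indices of one mini-batch
            err = X[idx] @ w + b - y[idx]             # residuals on this batch
            w -= learning_rate * 2 * X[idx].T @ err / len(idx)  # MSE gradient step for w
            b -= learning_rate * 2 * err.mean()                 # MSE gradient step for b
        losses.append(np.mean((X @ w + b - y) ** 2))  # full-data loss after each epoch
    return w, b, losses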

Install

pip install Mbgdregressor==1.1

Usage

>>> import numpy as np
>>> from Mbgdregressor import Mbgdregressor  # assuming the class is exposed at the package's top level
>>> from sklearn.datasets import load_diabetes
>>> from sklearn.model_selection import train_test_split

>>> X, y = load_diabetes(return_X_y=True)
>>> X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=2)
>>> batch = X_train.shape[0] // 20   # roughly 20 mini-batches per epoch
>>> model = Mbgdregressor(batch_size=batch, learning_rate=0.01, epochs=100)
>>> model.fit(X_train, y_train)
>>> y_pred = model.predict(X_test)
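
Assuming predict returns an ordinary array of predictions, as in the snippet above, the fit can then be scored with the usual scikit-learn metrics, for example:

>>> from sklearn.metrics import r2_score
>>> r2_score(y_test, y_pred)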

Performance

The graph below compares the loss trajectory of stochastic gradient descent with that of mini-batch gradient descent. The mini-batch curve is noticeably less erratic than the stochastic gradient descent curve, yet it does not descend as smoothly as batch gradient descent; it sits between the two.

Figure: loss curves for stochastic gradient descent and mini-batch gradient descent.
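
A comparison like the one in the graph can be approximated by recording the full-data loss after every epoch for different batch sizes: with a batch size of 1 the updates behave like stochastic gradient descent and the curve is noisy, with a batch size equal to the whole training set they behave like batch gradient descent and the curve is smooth, and anything in between gives the mini-batch behaviour. A rough sketch, reusing the illustrative minibatch_gd function from the description above (not the package's own API):

from sklearn.datasets import load_diabetes

X, y = load_diabetes(return_X_y=True)

# batch_size=1 behaves like stochastic GD, len(X) like batch GD,
# and anything in between is mini-batch GD
_, _, sgd_curve   = minibatch_gd(X, y, batch_size=1)
_, _, mini_curve  = minibatch_gd(X, y, batch_size=len(X) // 20)
_, _, batch_curve = minibatch_gd(X, y, batch_size=len(X))
# plotting the three per-epoch loss lists gives curves of the kind shown above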

Release history

This version: 1.1

Download files

Download the file for your platform.

Source Distribution

Mbgdregressor-1.1.tar.gz (14.8 kB)

Built Distribution

Mbgdregressor-1.1-py3-none-any.whl (16.6 kB)
