mbgdregressor
A small library that runs mini-batch gradient descent on your linear regression dataset, an optimizer that is not implemented out of the box in the sklearn library.
Install
pip install Mbgdregressor
Usage
>>> import numpy as np
>>> from sklearn.datasets import load_diabetes
>>> from sklearn.model_selection import train_test_split
>>> from Mbgdregressor import Mbgdregressor
>>> X, y = load_diabetes(return_X_y=True)
>>> X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=2)
>>> batch = X_train.shape[0] // 20
>>> model = Mbgdregressor(batch_size=batch, learning_rate=0.01, epochs=100)
>>> model.fit(X_train, y_train)
>>> y_pred = model.predict(X_test)
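Once predictions are in hand, they can be scored like any other regressor, for example with scikit-learn's `r2_score`. A minimal sketch, using placeholder arrays in place of the `y_test` / `y_pred` values a real run would produce:

```python
# Scoring predictions with scikit-learn's r2_score.
# The arrays below are hypothetical values for illustration only.
import numpy as np
from sklearn.metrics import r2_score

y_test = np.array([151.0, 75.0, 141.0, 206.0])   # placeholder true targets
y_pred = np.array([148.2, 80.5, 139.9, 198.7])   # placeholder predictions

score = r2_score(y_test, y_pred)
print(round(score, 3))
```

An R² close to 1.0 indicates the fitted line explains most of the variance in the held-out targets.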
Performance
The graph compares the learning curves of stochastic gradient descent and mini-batch gradient descent. Notably, the mini-batch curve is neither as erratic as the stochastic gradient descent trajectory nor as smooth as the batch gradient descent trajectory; it falls between the two.
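The behavior described above comes from updating the weights on small random subsets of the data rather than on single samples or the full dataset. A self-contained sketch of the technique in plain NumPy (this is an illustration of mini-batch gradient descent, not the library's actual source):

```python
# Mini-batch gradient descent for linear regression, sketched in NumPy.
import numpy as np

def mbgd_fit(X, y, batch_size=8, learning_rate=0.1, epochs=200, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        idx = rng.permutation(n)                  # reshuffle every epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            err = Xb @ w + b - yb                 # residuals on this mini-batch
            w -= learning_rate * (Xb.T @ err) / len(batch)
            b -= learning_rate * err.mean()
    return w, b

# Toy data with a known answer: y = 3*x0 - 2*x1 + 1
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = 3 * X[:, 0] - 2 * X[:, 1] + 1.0
w, b = mbgd_fit(X, y)
print(np.round(w, 2), round(b, 2))
```

Each epoch touches every sample exactly once, but in `n // batch_size` noisy gradient steps, which is why the loss curve wiggles more than batch gradient descent and less than one-sample stochastic updates.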