
Simple kNN algorithm with k-Fold Cross Validation


simple-kNN


This repository provides Continuous Integration for my simple k-Nearest Neighbors (kNN) algorithm, published as a PyPI package.

For the notebook version, please visit this repository.

k-Nearest Neighbors

k-Nearest Neighbors, kNN for short, is a simple but powerful technique for making predictions. The principle behind kNN is to make predictions for new data using its most similar historical examples.

k-Nearest Neighbors in 4 easy steps

  • Choose a value for k
  • Compute the distance from the new point to each record in the training data
  • Select the k nearest neighbors
  • Make a prediction (a minimal sketch of these steps follows this list)
    • For a classification problem, the new data point is assigned to the class that most of its neighbors belong to.
    • For a regression problem, the prediction is the average or weighted average of the labels of the k nearest neighbors.
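
The sketch below walks through these steps for the classification case in plain Python, using only the standard library. The helper names euclidean_distance and knn_classify are chosen for this illustration and are not part of the simple-kNN package.

from collections import Counter
import math

def euclidean_distance(a, b):
    # Distance between two equal-length feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(train_X, train_y, new_point, k=3):
    # Step 2: distance from the new point to every training record
    distances = [(euclidean_distance(x, new_point), label)
                 for x, label in zip(train_X, train_y)]
    # Step 3: keep the k nearest neighbors
    neighbors = sorted(distances, key=lambda d: d[0])[:k]
    # Step 4: majority vote among the neighbors' labels
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Example: classify a new 2-D point against a tiny training set
X = [[1, 1], [1, 2], [5, 5], [6, 5]]
y = ["a", "a", "b", "b"]
print(knn_classify(X, y, [5, 6], k=3))  # -> "b"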

Finally, we evaluate the model using the k-Fold Cross Validation technique.

k-Fold Cross Validation

This technique involves randomly dividing the dataset into k groups, or folds, of approximately equal size. One fold is held out for testing and the model is trained on the remaining k-1 folds; the process is repeated k times so that each fold is used once as the test set, and the resulting scores are averaged.
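
The sketch below illustrates this procedure, reusing the knn_classify helper from the previous sketch. The names k_fold_split and cross_validate are chosen for this illustration and are not functions exposed by the package.

import random

def k_fold_split(n_samples, k=5, seed=42):
    # Shuffle the indices, then cut them into k folds of roughly equal size
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(X, y, k_folds=5, k_neighbors=3):
    folds = k_fold_split(len(X), k_folds)
    scores = []
    for test_idx in folds:
        # Train on the other k-1 folds, test on the held-out fold
        test_set = set(test_idx)
        train_X = [x for j, x in enumerate(X) if j not in test_set]
        train_y = [t for j, t in enumerate(y) if j not in test_set]
        correct = sum(knn_classify(train_X, train_y, X[j], k_neighbors) == y[j]
                      for j in test_idx)
        scores.append(correct / len(test_idx))
    # Average accuracy across the k folds
    return sum(scores) / len(scores)

The slicing idx[i::k] spreads any remainder across the folds, so fold sizes differ by at most one.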

Installation

pip install simple-kNN

Usage

from simple_kNN.distanceMetrics import distanceMetrics
from simple_kNN.kFoldCV import kFoldCV
from simple_kNN.kNNClassifier import kNNClassifier
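
Continuing from the imports above, the example below is one guess at how these classes might be combined, based only on their names; the kFCVEvaluate call and its arguments are assumptions made for illustration and should be checked against the notebook repository linked above.

# Assumed data layout: each row holds the features followed by the class label
data = [[1.0, 1.0, 'a'], [1.5, 2.0, 'a'], [1.2, 1.8, 'a'],
        [5.0, 5.0, 'b'], [6.0, 5.0, 'b'], [5.5, 4.5, 'b']]

# Hypothetical call: 3-fold cross validation of a kNN classifier with
# k = 3 neighbors and Euclidean distance
kfcv = kFoldCV()
kfcv.kFCVEvaluate(data, 3, 3, 'euclidean')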

