Python library for Variants of Support Vector Machines

Variant-SVMs

VarSVM is a Python module of scikit-learn compatible estimators for solving variants of Support Vector Machines (SVMs).

Website: https://variant-svm.readthedocs.io

This project was created by Ben Dai. If you have any problems or suggestions, please contact me via <bdai@umn.edu>.

Installation

Dependencies

varsvm requires:

  • Python

  • NumPy

  • Pandas

  • scikit-learn

User installation

Install Variant-SVMs using pip

pip install varsvm

or

pip install git+https://github.com/statmlben/varsvm.git

Please install python3-dev before installing varsvm

sudo apt-get install python3-dev

Source code

You can check the latest sources with the command:

git clone https://github.com/statmlben/varsvm.git

Documentation

weightsvm

Classical weighted SVMs. A minimal usage sketch follows the parameter and method lists below.

  • class VarSVM.weightsvm(alpha=[], beta=[], C=1., max_iter = 1000, eps = 1e-4, print_step = 1)

    • Parameters:
      • alpha: Dual variable.

      • beta: Primal variable, or coefficients of the support vector in the decision function.

      • C: Penalty parameter C of the error term.

      • max_iter: Hard limit on iterations for coordinate descent.

      • eps: Tolerance for the stopping criterion, based on the relative l1 norm of the difference between beta and beta_old.

      • print_step: Whether to print the iterations of coordinate descent; 1 indicates yes, 0 indicates no.

    • Methods:
      • decision_function(X): Evaluates the decision function for the samples in X.
        • X : array-like, shape (n_samples, n_features)

      • fit(X, y, sample_weight=1.): Fit the SVM model.
        • X : {array-like, sparse matrix}, shape (n_samples, n_features)

        • y : array-like, shape (n_samples,) NOTE: y must be +1 or -1!

        • sample_weight : array-like, shape (n_samples,), weight for each sample.
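
The following is a minimal usage sketch for weightsvm, assuming the constructor and fit/decision_function signatures documented above and the lowercase import style used in the Example section; the per-sample weights are purely illustrative.

import numpy as np
from sklearn.datasets import make_classification
from varsvm import weightsvm

# toy binary data; weightsvm expects labels in {-1, +1}
X, y = make_classification(n_features=4, random_state=0)
y = 2 * y - 1

# illustrative per-sample weights: up-weight the positive class
sample_weight = np.where(y > 0, 2., 1.)

clf = weightsvm(C=1., max_iter=1000, eps=1e-4, print_step=0)
clf.fit(X=X, y=y, sample_weight=sample_weight)
score = clf.decision_function(X=X)  # real-valued decision scores
y_pred = np.sign(score)             # predicted labels in {-1, +1}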

Drift SVM

SVM with a drift, i.e., a fixed per-instance intercept. A minimal usage sketch follows the parameter and method lists below.

  • class VarSVM.driftsvm(alpha=[], beta=[], C=1., max_iter = 1000, eps = 1e-4, print_step = 1)

    • Parameters:
      • alpha: Dual variable.

      • beta: Primal variable, or coefficients of the support vector in the decision function.

      • C: Penalty parameter C of the error term.

      • max_iter: Hard limit on iterations for coordinate descent.

      • eps: Tolerance for the stopping criterion, based on the relative l1 norm of the difference between beta and beta_old.

      • print_step: Whether to print the iterations of coordinate descent; 1 indicates yes, 0 indicates no.

    • Methods:
      • decision_function(X): Evaluates the decision function for the samples in X.
        • X : array-like, shape (n_samples, n_features)

      • fit(X, y, drift, sample_weight=1.): Fit the SVM model.
        • X : {array-like, sparse matrix}, shape (n_samples, n_features)

        • y : array-like, shape (n_samples,). NOTE: y must be +1 or -1!

        • drift: array-like, shape (n_samples,), drift or fixed intercept for each instance, see doc.

        • sample_weight : array-like, shape (n_samples,), weight for each instance.
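
A minimal sketch for driftsvm, assuming the signatures documented above; following the Example section at the end of this page, the drift vector is a constant per-instance intercept and is also passed to decision_function.

import numpy as np
from sklearn.datasets import make_classification
from varsvm import driftsvm

X, y = make_classification(n_features=4, random_state=0)
y = 2 * y - 1                  # labels must be +1 or -1

drift = .28 * np.ones(len(X))  # one fixed intercept per instance

clf = driftsvm(C=1., print_step=0)
clf.fit(X=X, y=y, drift=drift)
score = clf.decision_function(X=X, drift=drift)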

Non-negative Drift SVM

SVM with non-negativity constraints on the coefficients; see the Example section below.

  • class VarSVM.noneg_driftsvm(alpha=[], beta=[], C=1., max_iter = 1000, eps = 1e-4, print_step = 1)

    • Parameters:
      • alpha: Dual variable.

      • beta: Primal variable, or coefficients of the support vector in the decision function.

      • C: Penalty parameter C of the error term.

      • max_iter: Hard limit on iterations for coordinate descent.

      • eps: Tolerance for the stopping criterion, based on the relative l1 norm of the difference between beta and beta_old.

      • print_step: Whether to print the iterations of coordinate descent; 1 indicates yes, 0 indicates no.

    • Methods:
      • decision_function(X): Evaluates the decision function for the samples in X.
        • X : array-like, shape (n_samples, n_features)

      • fit(X, y, drift, sample_weight=1.): Fit the SVM model.
        • X : {array-like, sparse matrix}, shape (n_samples, n_features)

        • y : array-like, shape (n_samples,). NOTE: y must be +1 or -1!

        • drift: array-like, shape (n_samples,), drift or fixed intercept for each instance, see doc.

        • sample_weight : array-like, shape (n_samples,), weight for each instance.

Example

import numpy as np
from sklearn.datasets import make_classification
from varsvm import noneg_driftsvm
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_features=4, random_state=0)
y = y * 2 - 1  # convert labels from {0, 1} to {-1, +1}

# fit a single model
n = len(X)
drift = .28*np.ones(n)  # one fixed intercept per instance

clf = noneg_driftsvm()
clf.fit(X=X, y=y, drift=drift)
y_pred = clf.decision_function(X=X, drift=drift)

# Tune hyperparameters with sklearn.model_selection.GridSearchCV
parameters = {'C': [1, 10]}
psvm = noneg_driftsvm()
clf = GridSearchCV(psvm, parameters)
clf.fit(X, y, drift=drift)  # extra fit parameters are forwarded to noneg_driftsvm.fit
