A scikit-compliant package for Multiple Kernel Learning

MKLpy is a scikit-compliant framework for Multiple Kernel Learning and kernel machines.

This package contains:

- MKL algorithms:
  * EasyMKL
  * RM-GD
  * R-MKL
  * Average of kernels
- a meta-MKL-classifier for multiclass problems, following the one-vs-one pattern (see the sketch after this list);
- tools to operate on kernels, such as normalization, centering, summation, mean…;
- metrics, such as kernel_alignment, radius…;
- kernel functions, such as HPK and boolean kernels (disjunctive, conjunctive, DNF, CNF).
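
As a rough sketch of the multiclass use case, the one-vs-one meta-classifier wraps an MKL algorithm and handles the label decomposition. The module and class name used below (`MKLpy.multiclass.OneVsOneMKLClassifier`) are assumptions and may differ across MKLpy versions:

```python
# Hedged sketch: OneVsOneMKLClassifier and its location are assumptions,
# check the multiclass module of your installed MKLpy version.
from MKLpy.multiclass import OneVsOneMKLClassifier
from MKLpy.algorithms import EasyMKL

mkl = EasyMKL(lam=0.1)                        # base MKL algorithm
clf = OneVsOneMKLClassifier(mkl).fit(KL, Y)   # KL: list of kernels, Y: multiclass labels
y_pred = clf.predict(KL)
```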

For more information about classification, kernels, and predictors, see the [scikit-learn documentation](https://scikit-learn.org).

## requirements

To work properly, MKLpy requires:

- numpy
- scikit-learn
- cvxopt

## examples

**LOADING DATA**

Data can be loaded with scikit-learn, using the svmlight format:

```python
from sklearn.datasets import load_svmlight_file

X, Y = load_svmlight_file(path)
X = X.toarray()   # Important! MKLpy requires dense matrices!
```

**PREPROCESSING**

MKLpy provides several tools to preprocess data; some examples:

```python
from MKLpy.regularization import normalization, rescale_01

X = rescale_01(X)
X = normalization(X)
```

It is also possible to operate directly on kernels:

```python
from MKLpy.metrics.pairwise import HPK_kernel
from MKLpy.regularization import kernel_centering, kernel_normalization, tracenorm

K  = HPK_kernel(X, degree=2)
Kc = kernel_centering(K)
Kn = kernel_normalization(K)
Kt = tracenorm(K)
```
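
As a quick sanity check (plain numpy, not an MKLpy call), and assuming kernel_normalization performs the usual cosine normalization K(x,z)/sqrt(K(x,x)K(z,z)), the normalized kernel should have a unit diagonal, while centering in feature space makes rows and columns sum to zero:

```python
import numpy as np

print(np.diag(Kn))     # all (approximately) 1 after normalization
print(abs(Kc.sum()))   # (approximately) 0 after centering
```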

**GENERATION**

MKL algorithms require a list (or array) of kernels; any custom list can be created:

```python
KL = [HPK_kernel(X, degree=d) for d in range(1, 11)]

# creating lists of boolean kernels
from MKLpy.metrics.pairwise import (monotone_conjunctive_kernel as mCK,
                                    monotone_disjunctive_kernel as mDK)
# WARNING: boolean kernels require binary valued data {0,1}
KL = [mCK(X, k=d) for d in range(1, 11)] + [mDK(X, k=d) for d in range(2, 11)]
```
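
If the data are not already binary, one quick way to obtain {0,1} features before computing boolean kernels is scikit-learn's Binarizer; the threshold used here is just an illustrative choice:

```python
from sklearn.preprocessing import Binarizer

# map features to {0,1}; threshold=0.5 is an arbitrary example value
Xbin = Binarizer(threshold=0.5).fit_transform(X)
KL = [mCK(Xbin, k=d) for d in range(1, 11)] + [mDK(Xbin, k=d) for d in range(2, 11)]
```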

**LEARNING**

The learning phase consists of two steps: learning the combined kernel with an MKL algorithm, and fitting a model with a standard kernel machine:

```python
from MKLpy.algorithms import EasyMKL, RMGD, RMKL, AverageMKL

# learn kernels
K_easy = EasyMKL(lam=0.1).arrange_kernel(KL, Y)
K_rmgd = RMGD(max_iter=3).arrange_kernel(KL, Y)

# fit models
from sklearn.svm import SVC
from MKLpy.algorithms import KOMD

clf_komd = KOMD(lam=0.1, kernel='precomputed').fit(K_easy, Y)
clf_svc  = SVC(C=10, kernel='precomputed').fit(K_rmgd, Y)
```

A more convenient procedure lets the MKL algorithms use a default base learner:

```python
clf = EasyMKL().fit(KL, Y)
clf = AverageMKL().fit(KL, Y)
```

It is also possible to set a custom base learner:

```python
clf = EasyMKL(estimator=SVC(C=1)).fit(KL, Y)
```

**EVALUATION**

It is possible to evaluate a model by splitting the kernel list into training and test sets:

```python
from MKLpy.model_selection import train_test_split, cross_val_score
from sklearn.metrics import roc_auc_score

KLtr, KLte, Ytr, Yte = train_test_split(KL, Y, train_size=.75, random_state=42)
y_score = clf.fit(KLtr, Ytr).decision_function(KLte)
auc_score = roc_auc_score(Yte, y_score)
```
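
Other scikit-learn metrics work on the same split; for instance accuracy, assuming the fitted classifier exposes the standard predict method (as scikit-compliant estimators do):

```python
from sklearn.metrics import accuracy_score

# assumption: predict is available on the fitted MKL classifier
y_pred = clf.predict(KLte)
accuracy = accuracy_score(Yte, y_pred)
```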

Or by using a cross-validation procedure:

```python
clf = EasyMKL(estimator=SVC())
scores = cross_val_score(KL, Y, estimator=clf, n_folds=5)
```
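
Assuming cross_val_score returns one score per fold (as its scikit-learn counterpart does), the per-fold scores can be summarized with numpy:

```python
import numpy as np

print('score: %.3f +- %.3f' % (np.mean(scores), np.std(scores)))
```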

**OTHER TOOLS**

MKLpy contains a wide set of tools for kernel learning and MKL; a simple example:

```python
from MKLpy.metrics import margin, radius

K = AverageMKL().arrange_kernel(KL, Y)
rho = margin(K, Y)   # distance between classes
R = radius(K)        # radius of the Minimum Enclosing Ball (MEB)
```
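
The two quantities are often combined into the radius-margin ratio R^2/rho^2, the kind of objective that radius-based MKL formulations (such as R-MKL) optimize; computing it from the values above is plain arithmetic:

```python
# radius-margin ratio: smaller values indicate a tighter radius-margin bound
ratio = (R ** 2) / (rho ** 2)
print('radius-margin ratio:', ratio)
```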
