PyLDL
Label distribution learning (LDL) and label enhancement (LE) toolkit implemented in Python, including:

- LDL algorithms:
  - (Geng, Yin, and Zhou 2013) [TPAMI]: `CPNN`$^1$.
  - (Geng and Hou 2015) [IJCAI]: `LDSVR`.
  - ⭐ (Geng 2016) [TKDE]: `SA_BFGS`, `SA_IIS`, `AA_KNN`, `AA_BP`, `PT_Bayes`, and `PT_SVM`.
  - (Yang, Sun, and Sun 2017) [AAAI]: `BCPNN` and `ACPNN`.
  - (Xu and Zhou 2017) [IJCAI]: `IncomLDL`$^2$.
  - (Shen et al. 2017) [NeurIPS]: `LDLF`.
  - (Wang and Geng 2019) [IJCAI]: `LDL4C`$^3$.
  - (Shen et al. 2020) [Journal of Nanjing University of Science and Technology (Chinese)]: `AdaBoostLDL`.
  - (González et al. 2021a) [Information Sciences]: `SSG_LDL`$^4$.
  - (González et al. 2021b) [Information Fusion]: `DF_LDL`.
  - (Wang and Geng 2021a) [IJCAI]: `LDL_HR`$^3$.
  - (Wang and Geng 2021b) [ICML]: `LDLM`$^3$.
  - (Jia et al. 2021) [TKDE]: `LDL_SCL`.
  - (Jia et al. 2023) [TKDE]: `LDL_LRR`.
  - (Wen et al. 2023) [ICCV]: `CAD`$^1$, `QFD2`$^1$, and `CJS`$^1$.
- LE algorithms:
  - (Xu, Liu, and Geng 2019) [TKDE]: `FCM`, `KM`, `LP`, `ML`, and `GLLE`.
  - (Xu et al. 2020) [ICML]: `LEVI`.
  - (Zheng, Zhu, and Tang 2023) [CVPR]: `LIBLE`.
- LDL metrics: `chebyshev`, `clark`, `canberra`, `kl_divergence`, `cosine`, `intersection`, etc.
- LDL datasets: Human_Gene, Movie, Natural_Scene, s-BU_3DFE, s-JAFFE, Yeast, etc.
$^1$ Technically, these methods are only suitable for totally ordered labels.

$^2$ These are algorithms for incomplete LDL, so you should use `utils.random_missing` to generate the missing label distribution matrix and the corresponding mask matrix in the experiments.

$^3$ These are LDL classifiers, so you should use `predict_proba` to get label distributions and `predict` to get predicted labels.

$^4$ These are oversampling algorithms for LDL, therefore you should use `fit_transform` to generate synthetic samples.
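As a concrete illustration of notes 2–4, here is a minimal sketch of how these special-purpose estimators might be driven. The exact call signatures (the arguments of `random_missing`, whether the mask matrix is passed to `fit`, and the return value of `fit_transform`) are assumptions made for illustration, not the documented API.

```python
from pyldl.utils import load_dataset, random_missing
from pyldl.algorithms import IncomLDL, LDL4C, SSG_LDL

X, y = load_dataset('SJAFFE')

# Incomplete LDL (note 2). Assumption: random_missing takes the label
# distribution matrix and a missing rate, and returns the incomplete
# matrix together with the corresponding mask matrix.
y_missing, mask = random_missing(y, missing_rate=0.2)
incom = IncomLDL()
incom.fit(X, y_missing, mask)  # assumption: the mask is supplied to fit

# LDL classification (note 3): train on label distributions, then use
# predict_proba for distributions and predict for hard labels.
clf = LDL4C()
clf.fit(X, y)
distributions = clf.predict_proba(X)
labels = clf.predict(X)

# Oversampling (note 4). Assumption: fit_transform returns the augmented
# feature matrix and label distribution matrix.
X_aug, y_aug = SSG_LDL().fit_transform(X, y)
```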
Usage
Here is an example of using PyLDL.
```python
from pyldl.utils import load_dataset
from pyldl.algorithms import SA_BFGS
from pyldl.metrics import score

from sklearn.model_selection import train_test_split

# Load the s-JAFFE dataset and split it into training and test sets.
dataset_name = 'SJAFFE'
X, y = load_dataset(dataset_name)
X_train, X_test, y_train, y_test = train_test_split(X, y)

# Train an SA-BFGS model and score the predicted label distributions.
model = SA_BFGS()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

print(score(y_test, y_pred))
```
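The individual metric functions listed above can presumably be imported from `pyldl.metrics` as well. The sketch below, which continues the example, assumes they share the `(y_true, y_pred)` signature of `score`; this is an assumption rather than documented behaviour.

```python
from pyldl.metrics import chebyshev, clark, kl_divergence

# Assumption: each metric takes (y_true, y_pred) and returns a scalar,
# mirroring the aggregated score function used above.
print(chebyshev(y_test, y_pred))
print(clark(y_test, y_pred))
print(kl_divergence(y_test, y_pred))
```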
For those who would like to use the original implementation:
- Install MATLAB.
- Install the MATLAB engine for Python.
- Download the LDL Package from here.
- Get the package directory of PyLDL (...\Lib\site-packages\pyldl).
- Place the LDLPackage_v1.2 folder into the matlab_algorithms folder.
Now, you can load the original implementation of the method, e.g.:
```python
from pyldl.matlab_algorithms import SA_IIS
```
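Continuing the earlier example, a reasonable assumption is that the MATLAB-backed classes expose the same `fit`/`predict` interface as the native implementations; the sketch below relies on that assumption rather than on documented behaviour.

```python
from pyldl.matlab_algorithms import SA_IIS

# Assumption: the MATLAB-backed estimators follow the same fit/predict
# interface as the native Python implementations shown above.
model = SA_IIS()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
print(score(y_test, y_pred))
```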
You can visualize the performance of any model on the artificial dataset (Geng 2016) with the `pyldl.utils.plot_artificial` function, e.g.:

```python
from pyldl.algorithms import LDSVR, SA_BFGS, SA_IIS, AA_KNN, PT_Bayes, GLLE, LIBLE
from pyldl.utils import plot_artificial

methods = ['LDSVR', 'SA_BFGS', 'SA_IIS', 'AA_KNN', 'PT_Bayes', 'GLLE', 'LIBLE']

# Plot the ground-truth label distributions first.
plot_artificial(model=None, figname='GT')

# Then plot the label distributions recovered by each method.
for i in methods:
    plot_artificial(model=eval(f'{i}()'), figname=i)
```
The output images show the ground truth ('GT') followed by the label distributions recovered by LDSVR, SA_BFGS, SA_IIS, AA_KNN, PT_Bayes, GLLE, and LIBLE.
Enjoy! :)
Experiments
For each algorithm, ten-fold cross validation is performed on the s-JAFFE dataset and repeated 10 times, and the average metrics are recorded. Note that these results alone do not fully characterize the performance of each model.
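A sketch of this protocol, assuming that `score` returns an array-like of metric values that can be averaged across folds, might look as follows (scikit-learn's `RepeatedKFold` handles the repeated splitting):

```python
import numpy as np
from sklearn.model_selection import RepeatedKFold

from pyldl.algorithms import SA_BFGS
from pyldl.metrics import score
from pyldl.utils import load_dataset

X, y = load_dataset('SJAFFE')

# Ten-fold cross validation, repeated 10 times.
cv = RepeatedKFold(n_splits=10, n_repeats=10, random_state=0)
results = []
for train_idx, test_idx in cv.split(X):
    model = SA_BFGS()
    model.fit(X[train_idx], y[train_idx])
    y_pred = model.predict(X[test_idx])
    # Assumption: score returns an array-like of metric values.
    results.append(score(y[test_idx], y_pred))

print(np.mean(results, axis=0))  # average of each metric over all folds
print(np.std(results, axis=0))   # standard deviation of each metric
```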
The results of our implementation are as follows.
| Algorithm | Cheby.(↓) | Clark(↓) | Can.(↓) | K-L(↓) | Cos.(↑) | Int.(↑) |
|---|---|---|---|---|---|---|
| SA-BFGS | .092 ± .010 | .361 ± .029 | .735 ± .060 | .051 ± .009 | .954 ± .009 | .878 ± .011 |
| SA-IIS | .100 ± .009 | .361 ± .023 | .746 ± .050 | .051 ± .008 | .952 ± .007 | .873 ± .009 |
| AA-kNN | .098 ± .011 | .349 ± .029 | .716 ± .062 | .053 ± .010 | .950 ± .009 | .877 ± .011 |
| AA-BP | .120 ± .012 | .426 ± .025 | .889 ± .057 | .073 ± .010 | .931 ± .010 | .848 ± .011 |
| PT-Bayes | .116 ± .011 | .425 ± .031 | .874 ± .064 | .073 ± .012 | .932 ± .011 | .850 ± .012 |
| PT-SVM | .117 ± .012 | .422 ± .027 | .875 ± .057 | .072 ± .011 | .932 ± .011 | .850 ± .011 |
The results of the original MATLAB implementation (Geng 2016) are as follows.
| Algorithm | Cheby.(↓) | Clark(↓) | Can.(↓) | K-L(↓) | Cos.(↑) | Int.(↑) |
|---|---|---|---|---|---|---|
| SA-BFGS | .107 ± .015 | .399 ± .044 | .820 ± .103 | .064 ± .016 | .940 ± .015 | .860 ± .019 |
| SA-IIS | .117 ± .015 | .419 ± .034 | .875 ± .086 | .070 ± .012 | .934 ± .012 | .851 ± .016 |
| AA-kNN | .114 ± .017 | .410 ± .050 | .843 ± .113 | .071 ± .023 | .934 ± .018 | .855 ± .021 |
| AA-BP | .130 ± .017 | .510 ± .054 | 1.05 ± .124 | .113 ± .030 | .908 ± .019 | .824 ± .022 |
| PT-Bayes | .121 ± .016 | .430 ± .035 | .904 ± .086 | .074 ± .014 | .930 ± .016 | .846 ± .016 |
| PT-SVM | .127 ± .017 | .457 ± .039 | .935 ± .074 | .086 ± .016 | .920 ± .014 | .839 ± .015 |
Requirements
```
matplotlib>=3.6.1
numpy>=1.22.3
qpsolvers>=4.0.0
quadprog>=0.1.11
scikit-fuzzy>=0.4.2
scikit-learn>=1.0.2
scipy>=1.8.0
tensorflow>=2.8.0
tensorflow-addons>=0.22.0
tensorflow-probability>=0.16.0
```
Download files
File details
Details for the file python-ldl-0.0.1.tar.gz.
File metadata
- Download URL: python-ldl-0.0.1.tar.gz
- Upload date:
- Size: 21.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.9.17
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c797b8732164790947e3b010145ab90447bdf282f35142431f9c751909521789 |
| MD5 | 7d3c9f273ce7c059046125628be2882e |
| BLAKE2b-256 | 0c8ac612c348fc6f23541d25c65515431066cd8f1d1b90ee634f0113340860c2 |
File details
Details for the file python_ldl-0.0.1-py3-none-any.whl.
File metadata
- Download URL: python_ldl-0.0.1-py3-none-any.whl
- Upload date:
- Size: 21.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.0.0 CPython/3.9.17
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | b4fcf71127502c9395981561997409979b2f0010eb65bc6aa6ceedee5a299954 |
| MD5 | 7b6e05d0507f088ee8ef72ab05da6efb |
| BLAKE2b-256 | bb5482c663ba31834c0266803b743d62489b45a4d8951d7c34282c5a4738d74d |