
Interface for using TabPFN and a library for training TabPFN.

Project description

TabPFN

TabPFN is a neural network trained to perform tabular data prediction. This is the original CUDA-supporting PyTorch implementation.

We created a Colab notebook that lets you play with our scikit-learn interface.

We also created two demos: one to experiment with TabPFN's predictions (https://huggingface.co/spaces/TabPFN/TabPFNPrediction) and one to check cross-validation ROC AUC scores on new datasets (https://huggingface.co/spaces/TabPFN/TabPFNEvaluation). Both run on a weak CPU, so they may take a little time. Both demos are based on a scikit-learn interface that makes using TabPFN as easy as a scikit-learn SVM.

Installation

pip install tabpfn

If you want to train and evaluate our method as we did in the paper (including the baselines), install with

pip install tabpfn[full]

To run the AutoGluon and auto-sklearn baselines, create a separate environment and install autosklearn / autogluon==0.4.0 there; installation in the same environment as our other baselines is not possible.

Getting started

A simple usage of our sklearn interface is:

from sklearn.metrics import accuracy_score
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.33, random_state=42)

# N_ensemble_configurations controls the number of model predictions that are ensembled with feature and class rotations (See our work for details).
# When N_ensemble_configurations > #features * #classes, no further averaging is applied.

classifier = TabPFNClassifier(device='cpu', N_ensemble_configurations=32)

classifier.fit(X_train, y_train)
y_eval, p_eval = classifier.predict(X_test, return_winning_probability=True)

print('Accuracy', accuracy_score(y_test, y_eval))
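TabPFN expects purely numeric inputs, so categorical columns must be encoded as integers before fitting. A minimal sketch using scikit-learn's OrdinalEncoder (the color/value columns here are made up for illustration):

```python
import numpy as np
from sklearn.preprocessing import OrdinalEncoder

# Hypothetical mixed data: one categorical column, one numeric column.
X_cat = np.array([["red"], ["blue"], ["green"], ["blue"]])
X_num = np.array([[1.5], [2.0], [0.5], [3.0]])

# Encode categories as integers; OrdinalEncoder sorts categories
# lexicographically, so blue -> 0, green -> 1, red -> 2.
encoder = OrdinalEncoder()
X_cat_encoded = encoder.fit_transform(X_cat)

# Stack back into a purely numeric feature matrix suitable for TabPFN.
X = np.hstack([X_cat_encoded, X_num])
print(X[:, 0])  # [2. 0. 1. 0.]
```

The resulting array can then be passed to `TabPFNClassifier.fit` like any other numeric feature matrix.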

TabPFN Usage

TabPFN is different from other methods you might know for tabular classification. Here, we list some tips and tricks that might help you understand how to use it best.

  • Do not preprocess inputs to TabPFN. TabPFN pre-processes inputs internally. It applies a z-score normalization ((x - train_x.mean()) / train_x.std()) per feature (fitted on the training set) and log-scales outliers heuristically. Finally, TabPFN applies a PowerTransform to all features for every second ensemble member. This pre-processing is important for TabPFN to ensure that the real-world dataset lies in the distribution of the synthetic datasets seen during training. So, to get the best results, do not apply a PowerTransform to the inputs yourself.
  • TabPFN expects scalar values only (you need to encode categoricals as integers, e.g. with OrdinalEncoder). It works best on data that does not contain any categorical or NaN values (see Appendix B.1).
  • TabPFN ensembles multiple input encodings by default. It feeds different index rotations of the features and labels to the model per ensemble member. You can control the ensembling with TabPFNClassifier(..., N_ensemble_configurations=?).
  • TabPFN does not use any statistics from the test set. That means predicting each test example one-by-one will yield the same result as feeding the whole test set together.
  • TabPFN is differentiable in principle; only the pre-processing is not, as it relies on NumPy.
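The per-feature z-score normalization mentioned above can be sketched as follows. This is an illustrative reimplementation, not TabPFN's actual code; the heuristic outlier log-scaling and PowerTransform steps are omitted:

```python
import numpy as np

# Synthetic stand-in data; TabPFN would receive your real train/test splits.
rng = np.random.default_rng(0)
train_x = rng.normal(loc=5.0, scale=2.0, size=(100, 3))
test_x = rng.normal(loc=5.0, scale=2.0, size=(20, 3))

# Fit mean and std per feature on the training set only...
mean = train_x.mean(axis=0)
std = train_x.std(axis=0)

# ...and apply the same transform to train and test, so no test-set
# statistics leak into the preprocessing.
train_z = (train_x - mean) / std
test_z = (test_x - mean) / std

print(train_z.mean(axis=0))  # ~0 for every feature
print(train_z.std(axis=0))   # ~1 for every feature
```

Because the statistics come from the training set alone, transforming test examples one at a time gives the same result as transforming the whole batch, consistent with the point above.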

Our Paper

Read our paper for more information about the setup (or contact us ☺️). If you use our method, please cite us using

@misc{tabpfn,
  doi = {10.48550/ARXIV.2207.01848},
  url = {https://arxiv.org/abs/2207.01848},
  author = {Hollmann, Noah and Müller, Samuel and Eggensperger, Katharina and Hutter, Frank},
  keywords = {Machine Learning (cs.LG), Machine Learning (stat.ML), FOS: Computer and information sciences},
  title = {TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second},
  publisher = {arXiv},
  year = {2022},
  copyright = {arXiv.org perpetual, non-exclusive license}
}

License

Copyright 2022 Noah Hollmann, Samuel Müller, Katharina Eggensperger, Frank Hutter

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
