
PFNs made ready for BO

Project description

PFNs

Prior-data Fitted Networks (PFNs, https://arxiv.org/abs/2112.10510) are transformer-based models trained to approximate Bayesian prediction. They learn this via supervised in-context learning on datasets randomly drawn from a prior. A prior can in general be described by a function that samples a dataset, or more generally a batch of datasets. The PFN is then trained to predict a held-out set of labels, given the rest of the dataset.

A simple prior that would yield a PFN doing 1d ridge regression on datasets with 100 elements could look like this:

import numpy as np

def get_dataset_sample():
    # Sample 100 scalar inputs and a random linear function y = a * x + b.
    rng = np.random.default_rng()
    x = rng.uniform(size=(100, 1))
    a = rng.normal()
    b = rng.normal()
    y = a * x + b
    return x, y

Check out our tutorial to train your own ridge regression PFN.
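To make the hold-out training target concrete, here is a minimal sketch of how one dataset sampled from such a prior is divided into an in-context part and held-out labels to predict. NumPy and the 80/20 split point are illustrative assumptions, not the package's actual training code:

```python
import numpy as np

rng = np.random.default_rng(0)

# One dataset drawn from the linear prior above.
x = rng.uniform(size=(100, 1))
a, b = rng.normal(), rng.normal()
y = a * x + b

# Split into an in-context part and a held-out query part;
# the PFN sees (x_ctx, y_ctx, x_query) and must predict y_query.
n_ctx = 80  # illustrative split point
x_ctx, y_ctx = x[:n_ctx], y[:n_ctx]
x_query, y_query = x[n_ctx:], y[n_ctx:]

print(x_query.shape, y_query.shape)  # → (20, 1) (20, 1)
```

During training, a fresh dataset (or batch of datasets) is sampled for every step, so the PFN never sees the same dataset twice and must learn the prediction rule implied by the prior.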

Install with pip

Installing in editable mode lets you use the package everywhere while still being able to edit its files. Use a PyTorch-compatible Python version (PyTorch often does not yet support the latest Python release).

git clone https://github.com/automl/PFNs.git
cd PFNs
pip install -e .

Developing

We use CI; you can run its parts locally beforehand:

  1. Tests: run pytest tests.
  2. Formatting: use pre-commit (install with pip install pre-commit, then pre-commit install) and run it manually with pre-commit run --all-files --show-diff-on-failure.

Get Started

Check out our Getting Started Colab.

Running actual, proper trainings from the command-line

We have a CLI, which is documented here.

What is in this package?

  • Code to train models with a variety of priors
  • The feature-wise architecture from TabPFNv2 as well as the traditional PFN architecture
  • A lot of normalizers to encode features well
  • Some code to run Bayesian Optimization experiments

BO

There is a BO version of this repo with pretrained models at github.com/automl/PFNs4BO. The two repos share much of their code, but the other repo is no longer actively maintained. You can also train your own models with our tutorial notebook here.

To run all BayesOpt experiments, please install this package with the benchmarks option:

pip install -e .[benchmarks]

Bayes' Power for Explaining In-Context Learning Generalizations

This repository is frozen at the state of the submission; all functionality has been copied to the actively maintained repository github.com/automl/PFNs.

This repository contains the code for the paper "Bayes' Power for Explaining In-Context Learning Generalizations".

Install in editable mode:

pip install -e .

We have a set of notebooks in this repository to reproduce the results of our paper.

  • To reproduce the main ICL experiments, use the notebook discrete_bayes.ipynb.
  • To run the Tiny-MLP generalization experiments, where we evaluate extrapolation, use the notebook Tiny_MLP_Generalization.ipynb.
  • To run the Coin-Flipping experiments, where we show that the true posterior converges to the wrong probability, use the notebook Cointhrowing_converging_to_wrong_posterior.ipynb.
  • To see the GP converging to the wrong solution for a step function, use the notebook GP_fitting_a_step.ipynb.

Cite the work

PFNs were introduced in

@inproceedings{
    muller2022transformers,
    title={Transformers Can Do Bayesian Inference},
    author={Samuel M{\"u}ller and Noah Hollmann and Sebastian Pineda Arango and Josif Grabocka and Frank Hutter},
    booktitle={International Conference on Learning Representations},
    year={2022},
    url={https://openreview.net/forum?id=KSugKcbNf9}
}

Training PFNs on tabular data (TabPFN) was enhanced in

@inproceedings{
  hollmann2023tabpfn,
  title={Tab{PFN}: A Transformer That Solves Small Tabular Classification Problems in a Second},
  author={Noah Hollmann and Samuel M{\"u}ller and Katharina Eggensperger and Frank Hutter},
  booktitle={The Eleventh International Conference on Learning Representations},
  year={2023},
  url={https://openreview.net/forum?id=cp5PvcI6w8_}
}

The BO version of PFNs was introduced in

@article{muller2023pfns,
  title={PFNs4BO: In-Context Learning for Bayesian Optimization},
  author={M{\"u}ller, Samuel and Feurer, Matthias and Hollmann, Noah and Hutter, Frank},
  journal={arXiv preprint arXiv:2305.17535},
  year={2023}
}

The paper "Bayes' Power for Explaining In-Context Learning Generalizations" is

@article{muller2024bayes,
  title={Bayes' Power for Explaining In-Context Learning Generalizations},
  author={M{\"u}ller, Samuel and Hollmann, Noah and Hutter, Frank},
  journal={arXiv preprint arXiv:2410.01565},
  year={2024}
}

The new architecture, which we support via config.model.features_per_group = <some small positive int, like 1> combined with config.model.attention_between_features = True, was introduced in

@article{hollmann2025accurate,
  title={Accurate predictions on small data with a tabular foundation model},
  author={Hollmann, Noah and M{\"u}ller, Samuel and Purucker, Lennart and Krishnakumar, Arjun and K{\"o}rfer, Max and Hoo, Shi Bin and Schirrmeister, Robin Tibor and Hutter, Frank},
  journal={Nature},
  volume={637},
  number={8045},
  pages={319--326},
  year={2025},
  publisher={Nature Publishing Group UK London}
}
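As a sketch, enabling the feature-wise TabPFNv2-style architecture comes down to setting the two config fields named above. The config object here is a hypothetical stand-in for the package's real training config; only the two attribute names are taken from this description:

```python
from types import SimpleNamespace

# Hypothetical stand-in for the package's training config object;
# only the two attribute names below come from the project description.
config = SimpleNamespace(model=SimpleNamespace())
config.model.features_per_group = 1             # some small positive int
config.model.attention_between_features = True  # enable feature-wise attention
```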



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pfns-0.4.2.tar.gz (9.4 MB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

pfns-0.4.2-py3-none-any.whl (154.3 kB)

Uploaded Python 3

File details

Details for the file pfns-0.4.2.tar.gz.

File metadata

  • Download URL: pfns-0.4.2.tar.gz
  • Upload date:
  • Size: 9.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for pfns-0.4.2.tar.gz
  • SHA256: 8f17f0124718f6e5627bc47ec42cf502fb0ca0e9ef2d3ea8ec2c11cb2b0275c8
  • MD5: beae93cbf8f61a47fbd1274244ce2475
  • BLAKE2b-256: 96b73e29e7368bc2d8a9188b37194eacc9a3fdc3636cce580602934a3dbcf8c5

See more details on using hashes here.

File details

Details for the file pfns-0.4.2-py3-none-any.whl.

File metadata

  • Download URL: pfns-0.4.2-py3-none-any.whl
  • Upload date:
  • Size: 154.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.13.2

File hashes

Hashes for pfns-0.4.2-py3-none-any.whl
  • SHA256: f5a17b3e3b14863a7d66c124e90a8d4559c510720158e5e1690d7735867e36cd
  • MD5: 2751d56f9c968199e3715a1ca801ab72
  • BLAKE2b-256: bfb64873e86484556927b6a0bd6682ca1a2b26b670bc8e526ab60561c29faba5

See more details on using hashes here.
