
A Modular Toolbox for Accelerating Meta-Learning Research :rocket:



MetaBlocks is a modular toolbox for research, experimentation, and reproducible benchmarking of learning-to-learn algorithms. The toolbox provides a flexible API for working with MetaDatasets, TaskDistributions, and MetaLearners. The API fully decouples data processing, construction of learning tasks (and their distributions), adaptation algorithms, and model architectures, and makes it easy to experiment with different combinations of these basic building blocks as well as to add new components to the growing ecosystem. Additionally, the library features a suite of benchmarks that enable reproducibility. Everything is highly configurable through Hydra.

The library is under active development. The latest documentation is available at: https://meta-blocks.readthedocs.io/.


Installation

MetaBlocks requires Python 3.5+ and TensorFlow 2.2+.

For typical use

We recommend using pip for installing the latest release of the library:

$ pip install meta-blocks            # normal install
$ pip install --upgrade meta-blocks  # or update if needed
$ pip install --pre meta-blocks      # or include pre-release version for new features

Alternatively, to install the latest version from the master branch:

$ git clone https://github.com/alshedivat/meta-blocks.git
$ pip install ./meta-blocks

Note: to be able to access and run benchmarks, you will need to clone the repository.

For development and contributions

You can install additional development requirements as follows:

$ pip install -r requirements/dev.txt

Also, please make sure to install pre-commit hooks to ensure proper code style and formatting:

$ pip install pre-commit      # install pre-commit
$ pre-commit install          # install git hooks
$ pre-commit run --all-files  # run pre-commit on all the files

Getting started & use cases

You can use the library as (1) a modular benchmarking suite or (2) a scaffold API for new learning-to-learn algorithms.

Benchmarking

To enable reproducible research, we maintain a suite of benchmarks in the benchmarks/ directory. To run a benchmark, simply clone the repo, change your working directory to the corresponding benchmark, and execute a run script. For example:

$ git clone https://github.com/alshedivat/meta-blocks.git
$ cd meta-blocks/benchmarks/omniglot
$ ./fetch_data                    # fetches data for the benchmark
$ ./run_classic_supervised.sh     # runs an experiment (train and eval routines in parallel)

For more details, please refer to benchmarks/README.md.

MetaBlocks API

MetaBlocks provides multiple layers of API implemented as a hierarchy of Python classes. The three main components are MetaDataset, TaskDistribution, and MetaLearner:

  1. MetaDataset provides access to Dataset instances constructed from an underlying DataSource. A Dataset represents a collection of data tensors (e.g., in the case of multi-class classification, a collection of input tensors, one per class).
  2. TaskDistribution builds on top of MetaDataset and provides access to Task instances that specify the semantics of a learning task. For example, a few-shot classification task provides access to non-overlapping support and query subsets of a Dataset. Task distributions determine how tasks are sampled and constructed. Currently, we support supervised and self-supervised tasks for few-shot classification.
  3. MetaLearner encapsulates a parametric model (your favorite neural net) and an adaptation algorithm used to adapt the model to new tasks. Adaptation algorithms must use the API exposed by the Task.

Note: decoupling tasks from datasets and (meta-)learning methods is one of the core advantages of meta-blocks over other libraries.
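The decoupling above can be illustrated with a minimal, library-agnostic sketch. Note that this is NOT the actual meta-blocks API: all class and method names below (`Task`, `MetaDataset`, `TaskDistribution`, `sample_task`) are hypothetical stand-ins chosen to mirror the structure described in the list.

```python
# Illustrative sketch of the MetaDataset -> TaskDistribution -> Task
# decoupling; hypothetical names, not the meta-blocks API.
import random
from dataclasses import dataclass

@dataclass
class Task:
    """A few-shot task exposing non-overlapping support and query sets."""
    support: list  # (input, label) pairs used for adaptation
    query: list    # (input, label) pairs used for evaluation

class MetaDataset:
    """Wraps raw data grouped by class label."""
    def __init__(self, data_by_class):
        self.data_by_class = data_by_class  # {label: [inputs]}

class TaskDistribution:
    """Samples few-shot classification tasks from a MetaDataset."""
    def __init__(self, meta_dataset, num_classes=2, num_shots=1, num_queries=1):
        self.md = meta_dataset
        self.num_classes = num_classes
        self.num_shots = num_shots
        self.num_queries = num_queries

    def sample_task(self, rng):
        classes = rng.sample(sorted(self.md.data_by_class), self.num_classes)
        support, query = [], []
        for label in classes:
            # Draw disjoint support and query examples for this class.
            pool = rng.sample(self.md.data_by_class[label],
                              self.num_shots + self.num_queries)
            support += [(x, label) for x in pool[:self.num_shots]]
            query += [(x, label) for x in pool[self.num_shots:]]
        return Task(support=support, query=query)

rng = random.Random(0)
md = MetaDataset({"a": [1, 2, 3], "b": [4, 5, 6], "c": [7, 8, 9]})
dist = TaskDistribution(md, num_classes=2, num_shots=1, num_queries=1)
task = dist.sample_task(rng)
print(len(task.support), len(task.query))  # 2 2
```

Because the task distribution only sees a MetaDataset and a MetaLearner only sees Tasks, datasets, task semantics, and adaptation algorithms can be swapped independently.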

Below are the components currently supported by the library:

| Component        | Supported instances                                          |
|------------------|--------------------------------------------------------------|
| MetaDataset      | Omniglot, miniImageNet, ...                                  |
| TaskDistribution | Classic supervised, limited supervised, self-supervised, ... |
| MetaLearner      | MAML [1], Reptile [2], Prototypical Networks [3], ...        |

Adding your own meta-datasets

To add your own meta-datasets, you need to subclass MetaDataset and implement a few methods.

[TODO: provide a detailed walk-through example.]

If the full data used to construct the meta-dataset is light and easily fits in memory, you can follow the implementation of Omniglot. If the dataset is too large or requires heavy preprocessing, the best approach is to use the tf.data.Dataset API. As a starting point, you can follow the miniImageNet implementation.
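The two loading strategies above (eager, in-memory vs. lazy, streaming) can be contrasted with a short, library-agnostic sketch. The class names and `load_fn` parameter here are hypothetical, not part of meta-blocks; in the streaming case, the generator is where a real implementation would build a tf.data.Dataset pipeline instead.

```python
# Library-agnostic sketch contrasting the two loading strategies;
# hypothetical names, not the meta-blocks API.

class InMemoryMetaDataset:
    """Omniglot-style: the full dataset is small, so load it eagerly."""
    def __init__(self, data_by_class):
        self.data_by_class = dict(data_by_class)  # everything lives in RAM

    def get_class(self, label):
        return self.data_by_class[label]

class StreamingMetaDataset:
    """miniImageNet-style: data is large, so yield examples on demand.

    With TensorFlow, this is where you would construct a tf.data.Dataset
    pipeline instead of a plain Python generator.
    """
    def __init__(self, paths_by_class, load_fn):
        self.paths_by_class = dict(paths_by_class)
        self.load_fn = load_fn  # e.g., reads and decodes one image file

    def iter_class(self, label):
        for path in self.paths_by_class[label]:
            yield self.load_fn(path)  # loaded lazily, one example at a time

eager = InMemoryMetaDataset({"a": [1, 2]})
lazy = StreamingMetaDataset({"a": ["img0.png", "img1.png"]},
                            load_fn=lambda p: f"decoded:{p}")
print(eager.get_class("a"))        # [1, 2]
print(list(lazy.iter_class("a")))  # ['decoded:img0.png', 'decoded:img1.png']
```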

Adding your own meta-learners

To add your own meta-learning algorithms, you need to subclass MetaLearner and implement two methods: _get_adapted_model (must return an adapted model instance) and _build_adaptation (must build the part of the computation graph that adapts the model). For example, prototype-based adaptation builds prototypes from the support set inside the _build_adaptation method and returns a model with the corresponding prototypes when _get_adapted_model is called.
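The prototype-based example above can be sketched in plain Python, in the spirit of Prototypical Networks [3]. This is not the meta-blocks implementation: the free functions below stand in for the roles of the two methods named above (noted in the comments), and the embeddings are toy 2-D vectors.

```python
# Plain-Python sketch of prototype-based adaptation (cf. Prototypical
# Networks [3]); not the meta-blocks API.
import math

def build_prototypes(support):
    """Average the support embeddings of each class.

    Plays the role of _build_adaptation in the description above.
    """
    sums, counts = {}, {}
    for embedding, label in support:
        acc = sums.setdefault(label, [0.0] * len(embedding))
        for i, v in enumerate(embedding):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def adapted_model(prototypes):
    """Return a nearest-prototype classifier.

    Plays the role of _get_adapted_model in the description above.
    """
    def classify(embedding):
        return min(prototypes,
                   key=lambda label: math.dist(embedding, prototypes[label]))
    return classify

support = [([0.0, 0.0], "a"), ([0.2, 0.0], "a"),
           ([1.0, 1.0], "b"), ([0.8, 1.0], "b")]
classify = adapted_model(build_prototypes(support))
print(classify([0.1, 0.1]))  # a
print(classify([0.9, 0.9]))  # b
```

In a real meta-learner, the embeddings would come from the encapsulated neural net and the prototype computation would be part of the TensorFlow graph; the adaptation logic itself is unchanged.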

[TODO: provide a detailed walk-through example.]


Citation

If you use meta-blocks for research, please cite it as follows.

@misc{metablocks,
  title={MetaBlocks: A modular toolbox for meta-learning research with a focus on speed and reproducibility.},
  year={2020},
  publisher={GitHub},
  journal={GitHub repository},
  howpublished={\url{https://github.com/alshedivat/meta-blocks}},
}

Related projects

A few notable related projects:

| Project     | Description                                                          |
|-------------|----------------------------------------------------------------------|
| Torchmeta   | A PyTorch library that implements multiple few-shot learning methods. |
| learn2learn | A PyTorch library that supports meta-RL.                             |

References

[1] Finn, C., Abbeel, P. and Levine, S. Model-agnostic meta-learning for fast adaptation of deep networks. ICML 2017.

[2] Nichol, A., Achiam, J. and Schulman, J. On first-order meta-learning algorithms. arXiv preprint arXiv:1803.02999.

[3] Snell, J., Swersky, K. and Zemel, R. Prototypical networks for few-shot learning. NeurIPS 2017.
