
Incremental tasks


This is a modular and extensible benchmark of progressively more difficult AI tasks for measuring the learning speed of ML systems.

This repository contains the code to generate the incremental task dataset used in [1].

Installation

This package can be used both from the command line and as a library. Install it from PyPI (ideally in a virtual environment if you don't want the CLI command to pollute your PATH):

pip install incremental_tasks

This installs the library as well as an executable, generate_tasks_cli.

Task generation

The command generate_tasks_cli can be used to generate sequences directly from the command line. They are printed to stdout and can be redirected to a file to quickly create a dataset.

Interactive task solving

Users can try the tasks themselves by running generate_tasks_cli --interactive. This starts an interactive session that shows random examples from the benchmark's tasks, starting from the easiest.

Once a task is solved, the session switches to a new, harder one.

An example interactive session:

$ generate_tasks_cli  --interactive

======================================================================
0 1 1 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 {?} {?} {?} {?} {?}
Type your answers (space separated) 0 0 0 1 1
OK!
0 1 1 0 0 0 1 1 0 1 1 0 0 0 1 1 0 1 1 0 0 0 1 1

======================================================================
1 0 0 0 1 0 0 0 {?} {?} {?} {?} {?}
Type your answers (space separated) 0 1 1 1 0
Wrong! right answer was:
1 0 0 0 1 0 0 0 1 0 0 0 1
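The sequences above are periodic continuations: the solver has to infer the repeating pattern and fill in the masked positions. The underlying idea can be sketched in a few lines of self-contained Python (this is an illustration, not the package's internal implementation):

```python
import itertools

def make_periodic_task(pattern, n_visible, n_masked):
    """Build a periodic-sequence task: show n_visible tokens of a
    repeating pattern, then ask for the next n_masked tokens."""
    stream = itertools.cycle(pattern)
    full = [next(stream) for _ in range(n_visible + n_masked)]
    return full[:n_visible], full[n_visible:]

# Reproduces the second example from the session above:
# pattern "1 0 0 0", 8 visible tokens, 5 masked tokens.
visible, answer = make_periodic_task([1, 0, 0, 0], 8, 5)
print(" ".join(map(str, visible)))  # 1 0 0 0 1 0 0 0
print(" ".join(map(str, answer)))   # 1 0 0 0 1
```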

In [1], the human evaluation scores were computed using this interactive game with the extra flag --human-eval, which maps every token to a random one so that players have no prior knowledge of the text and must rely on pattern matching, as a neural network would.
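Such a remapping can be thought of as a consistent random bijection over the vocabulary: identical tokens stay identical, so the pattern survives while surface meaning is hidden. A hypothetical sketch (not the package's own code):

```python
import random

def remap_tokens(sequence, seed=0):
    """Replace every distinct token with a random stand-in token,
    consistently across the sequence, so that equal tokens remain
    equal but carry no prior meaning."""
    rng = random.Random(seed)
    vocab = sorted(set(sequence))
    stand_ins = [f"T{i}" for i in range(len(vocab))]
    rng.shuffle(stand_ins)
    mapping = dict(zip(vocab, stand_ins))
    return [mapping[tok] for tok in sequence]

seq = ["I", "SMELL", "PETER", ".", "DO", "I", "SMELL", "PETER", "?"]
print(remap_tokens(seq))
```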

Library

You can use the library in your own code to generate the data on the fly:

from incremental_tasks import ElementaryLanguageWithWorldDef

task = ElementaryLanguageWithWorldDef()

To generate a single sentence from the task use generate_single:

print(task.generate_single())
# This will print (['I', 'DO', 'NOT', 'SMELL', 'PETER', '.', 'DO', 'I', 'SMELL', 'PETER', '?', 'NO'], [11])
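Assuming the second element of the returned pair lists the positions the model must predict (index 11 is the final 'NO' above), the pair can be split into a masked prompt and the target tokens with a small helper (hypothetical code, not part of the package):

```python
def split_prompt_targets(tokens, mask_indices):
    """Replace the positions listed in mask_indices with a query
    marker and collect the tokens to be predicted."""
    masked = set(mask_indices)
    prompt = ["{?}" if i in masked else tok for i, tok in enumerate(tokens)]
    targets = [tokens[i] for i in sorted(masked)]
    return prompt, targets

tokens = ['I', 'DO', 'NOT', 'SMELL', 'PETER', '.',
          'DO', 'I', 'SMELL', 'PETER', '?', 'NO']
prompt, targets = split_prompt_targets(tokens, [11])
print(" ".join(prompt))  # I DO NOT SMELL PETER . DO I SMELL PETER ? {?}
print(targets)           # ['NO']
```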

To generate n unique sequences (fewer than n if there aren't enough unique sequences available):

task.generate_tasks(max_n_seq=n)

A task can also create a generator that will yield an endless stream of sequences (not necessarily unique):

task.generate_tasks_generator(max_n_seq=None)

References

  • [1] Cisneros, H., Mikolov, T., & Sivic, J. (2022). Benchmarking Learning Efficiency in Deep Reservoir Computing. 1st Conference on Lifelong Learning Agents, Montreal, Canada.

Download files

Download the file for your platform.

Source Distribution

incremental_tasks-0.1.3.tar.gz (15.9 kB)

Uploaded Source

Built Distribution


incremental_tasks-0.1.3-py3-none-any.whl (16.7 kB)

Uploaded Python 3

File details

Details for the file incremental_tasks-0.1.3.tar.gz.

File metadata

  • Download URL: incremental_tasks-0.1.3.tar.gz
  • Upload date:
  • Size: 15.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0b2 CPython/3.9.9 Darwin/21.5.0

File hashes

Hashes for incremental_tasks-0.1.3.tar.gz:

  • SHA256: 4458a113a1afdf592f46bd672ca163ad373e4ce9eb8319be5db70372c1f8bc46
  • MD5: 884311ffd2a773234bf0c11edb977d2c
  • BLAKE2b-256: 41cefb879524c20abecea8938c99338fb042590c65849be67c83b4f0c221a705


File details

Details for the file incremental_tasks-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: incremental_tasks-0.1.3-py3-none-any.whl
  • Upload date:
  • Size: 16.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.2.0b2 CPython/3.9.9 Darwin/21.5.0

File hashes

Hashes for incremental_tasks-0.1.3-py3-none-any.whl:

  • SHA256: 9bd914fc316a7008ec54acd191ede99b828f06ae77919cc33ec65cf4471f9971
  • MD5: 2ea6f01a84ef171388f7c00f69cc5559
  • BLAKE2b-256: 36d5b923a542b8aa4dc09f2f249764084409bfca4126c5c15b4f5826f63925a8

