ABC random forests for model choice and parameter estimation, Python wrapper


Random forest methodologies for:

  • ABC model choice [@pudlo2015reliable]
  • ABC Bayesian parameter inference [@raynal2016abc]

Libraries we use:

  • Ranger [@wright2015ranger]: we use our own fork and have tuned the forests to do “online”[1] computations (growing trees AND making predictions in the same pass, which removes the need for in-memory storage of the whole forest)[2].
  • Eigen3 [@eigenweb]

Of note, we use our own implementations of LDA and PLS from [@friedman2001elements{81, 114}].

There is one set of binaries, containing a macOS/Linux/Windows (x64 only) binary for each platform. They are available in the “Releases” tab, under the “Assets” section (unfold it to see the list).

These are pure command-line binaries, and there are no prerequisites or library dependencies to run them. Just download them and launch them from your terminal software of choice. The usual caveats with command-line executables apply here: if you're not proficient with the command-line interface of your platform, please learn some basics or ask someone who can help you with those matters.

The standalone binary is part of DIYABC-RF, a specialized population-genetics graphical interface, with a submission (currently under review) to MER (Molecular Ecology Resources) [@Collin_2020].

Python

Installation

pip install pyabcranger
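
A quick check that the module is importable (the full API is demonstrated in the example notebooks):

import pyabcranger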

Notebook examples

Usage

ABC Random Forest - Model choice or parameter estimation command-line options
Usage:
  ../build/abcranger [OPTION...]

  -h, --header arg        Header file (default: headerRF.txt)
  -r, --reftable arg      Reftable file (default: reftableRF.bin)
  -b, --statobs arg       Statobs file (default: statobsRF.txt)
  -o, --output arg        Prefix output (modelchoice_out or estimparam_out by
                          default)
  -n, --nref arg          Number of samples, 0 means all (default: 0)
  -m, --minnodesize arg   Minimal node size. 0 means 1 for classification or
                          5 for regression (default: 0)
  -t, --ntree arg         Number of trees (default: 500)
  -j, --threads arg       Number of threads, 0 means all (default: 0)
  -s, --seed arg          Seed, generated by default (default: 0)
  -c, --noisecolumns arg  Number of noise columns (default: 5)
      --nolinear          Disable LDA for model choice or PLS for parameter
                          estimation
      --plsmaxvar arg     Percentage of maximum explained Y-variance for
                          retaining pls axis (default: 0.9)
      --chosenscen arg    Chosen scenario (mandatory for parameter
                          estimation)
      --noob arg          number of oob testing samples (mandatory for
                          parameter estimation)
      --parameter arg     name of the parameter of interest (mandatory for
                          parameter estimation)
  -g, --groups arg        Groups of models
      --help              Print help
  • If you provide --chosenscen, --parameter and --noob, parameter estimation mode is selected.
  • Otherwise, model choice mode is selected by default.
  • The linear additions are LDA for model choice and PLS for parameter estimation; the --nolinear option disables them in both cases.

Model Choice

Terminal model choice

Example


abcranger -t 10000 -j 8

Header, reftable and statobs files should be in the current directory.
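
The same run can be driven from Python via the standard library. A minimal sketch, assuming the abcranger binary is on your PATH:

# Minimal sketch: drive the abcranger CLI from Python.
# Assumes abcranger is on the PATH and headerRF.txt, reftableRF.bin
# and statobsRF.txt are in the current directory.
import subprocess

result = subprocess.run(
    ["abcranger", "-t", "10000", "-j", "8"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)
# Results are written to the modelchoice_out.* files (see "Generated files").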

Groups

With the -g (or --groups) option, you may “group” your models into several groups. For example, if you have six models, labeled from 1 to 6, `-g "1,2,3;4,5,6"` defines two groups, (1,2,3) and (4,5,6), and the model choice is then made between these two groups.

Generated files

Four files are created:

  • modelchoice_out.ooberror : OOB Error rate vs number of trees (line number is the number of trees)
  • modelchoice_out.importance : variable importances (sorted)
  • modelchoice_out.predictions : votes, prediction and posterior error rate
  • modelchoice_out.confusion : OOB Confusion matrix of the classifier
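
For instance, the ooberror file can be used to check that the forest has enough trees. A minimal sketch, assuming one OOB error value per line (the line number being the number of trees, as stated above):

# Minimal sketch: plot OOB error vs number of trees.
import matplotlib.pyplot as plt

with open("modelchoice_out.ooberror") as f:
    oob = [float(line) for line in f]

plt.plot(range(1, len(oob) + 1), oob)
plt.xlabel("number of trees")
plt.ylabel("OOB error rate")
plt.show()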

Parameter Estimation

Terminal estim param

Composite parameters

When specifying the parameter (option --parameter), one may specify simple composite parameters as the division, addition or multiplication of two existing parameters, like t/N or T1+T2.
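
For example (here t and N stand for two parameters declared in your header file):

abcranger -t 1000 -j 8 --chosenscen 1 --noob 50 --parameter t/N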

A note about the PLS heuristic

The --plsmaxvar option (defaulting to 0.90) fixes the number of selected PLS axes so that we get at least the specified percentage of the maximum explained variance of the output. The explained variance of the output for the first m axes is defined by the R-squared of the output:

Yvar^{m} = \frac{\sum_{i=1}^{N}{(\hat{y}^{m}_{i}-\bar{y})^2}}{\sum_{i=1}^{N}{(y_{i}-\bar{y})^2}}

where \hat{y}^{m} is the output Y predicted by the PLS with the first m components. So only the first n_{comp} axes are kept, with:

n_{comp} = \underset{Yvar^{m} \leq{} 0.90 \cdot Yvar^{M}}{\operatorname{argmax}}\, m

Note that if you specify 0 for --plsmaxvar, an “elbow” heuristic is activated, where the following condition is tested for every computed axis:

\frac{Yvar^{k+1}+Yvar^{k}}{2} \geq 0.99(N-k)\left(Yvar^{k+1}-Yvar^{k}\right)

If this condition is true over a window of previous axes, sized to 10% of the total possible number of axes, then we stop the PLS axis computation.

In practice, we find this n_{heur} close enough to the previous n_{comp} for a 99% threshold, but it isn't guaranteed.
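
As an illustration only (not the internal implementation), both selection rules can be sketched in Python, assuming yvar holds the cumulative explained variances Yvar^1 ... Yvar^M:

# Illustrative sketch of the two PLS axis-selection heuristics above.
def n_comp_threshold(yvar, plsmaxvar=0.90):
    # Smallest number of axes reaching plsmaxvar * Yvar^M.
    target = plsmaxvar * yvar[-1]
    for m, v in enumerate(yvar, start=1):
        if v >= target:
            return m
    return len(yvar)

def n_comp_elbow(yvar):
    # "Elbow" heuristic (--plsmaxvar 0): stop once the condition above
    # holds over a window of 10% of the total possible axes.
    N = len(yvar)
    window = max(1, N // 10)
    consecutive = 0
    for k in range(N - 1):
        lhs = (yvar[k + 1] + yvar[k]) / 2
        rhs = 0.99 * (N - k) * (yvar[k + 1] - yvar[k])
        consecutive = consecutive + 1 if lhs >= rhs else 0
        if consecutive >= window:
            return k + 1
    return N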

The meaning of the noob parameter

The median global/local statistics and confidence interval (global) measures for parameter estimation need a number of OOB samples (--noob) to be reliable (typically 30% of the size of the dataset is sufficient). Be aware that computing the whole set (i.e. assigning --noob the same value as --nref) for the weight predictions of [@raynal2016abc] could be very costly, memory- and CPU-wise, if your dataset has a large number of samples, so it may be advisable to compute them on a subset of size noob only.
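
For example, with --nref 10000, setting --noob 3000 (30% of the samples) should be sufficient, at a fraction of the cost of computing all 10000.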

Example (parameter estimation)

Example (working with the dataset in test/data):

abcranger -t 1000 -j 8 --parameter ra --chosenscen 1 --noob 50

Header, reftable and statobs files should be in the current directory.

Generated files (parameter estimation)

Five files (or seven if PLS is activated) are created:

  • estimparam_out.ooberror : OOB MSE rate vs number of trees (line number is the number of trees)
  • estimparam_out.importance : variable importances (sorted)
  • estimparam_out.predictions : expectation, variance and 0.05, 0.5, 0.95 quantiles for the prediction
  • estimparam_out.predweights : CSV of the value/weight pairs of the prediction (for density plots)
  • estimparam_out.oobstats : various OOB statistics (MSE, NMSE, NMAE, etc.)

If PLS is enabled:

  • estimparam_out.plsvar : variance explained by number of components
  • estimparam_out.plsweights : variable weights in the first component (sorted by absolute value)
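
For instance, the predweights file can feed a posterior density plot. A minimal sketch, assuming one value,weight pair per CSV line:

# Minimal sketch: weighted histogram of the posterior from predweights.
import csv
import matplotlib.pyplot as plt

values, weights = [], []
with open("estimparam_out.predweights") as f:
    for value, weight in csv.reader(f):
        values.append(float(value))
        weights.append(float(weight))

plt.hist(values, bins=50, weights=weights, density=True)
plt.xlabel("parameter value")
plt.ylabel("posterior density")
plt.show()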

TODO

Input/Output

  • Integrate hdf5 (or exdir? msgpack?) routines to save/load reftables/observed stats with associated metadata
  • Provide R code to save/load the data
  • Provide Python code to save/load the data

C++ standalone

  • Merge the two methodologies into a single executable with (almost) the same options
  • (Optional) Possibly move to another options parser (CLI?)

External interfaces

  • R package
  • Python package

Documentation

  • Code documentation
  • Document the build

Continuous integration

  • Fix the Travis build; currently the vcpkg download of eigen3 HEAD is broken.
  • macOS Travis build
  • AppVeyor win32 build

Long/Mid term TODO

  • Auto-tuning of methodology parameters
    • auto-discovering the optimal number of trees by monitoring OOB error
    • auto-limiting the number of threads by available memory
  • Streamline the two methodologies (model choice and then parameter estimation)
  • Write our own tree/RF implementation with better storage efficiency than ranger
  • Make functional tests for the two methodologies
  • Could Mondrian forests be used for online batches? See [@lakshminarayanan2014mondrian]

References

This has been the subject of a proceedings paper at JOBIM 2020, with PDF and video (in French) [@collin:hal-02910067].

[1] The term “online” here and in the code does not have the usual meaning it has in “online machine learning”: we still need the entire training dataset at once. Our implementation is “online” not in the sequential order of the input data, but in the sequential order of computation of the trees in the random forests, which are computed sequentially and then discarded.

[2] We only use the C++ core of ranger, which is under the MIT License, the same as ours.
