ABC random forests for model choice and parameter estimation, python wrapper

Project description

Random forests methodologies for:

  • ABC model choice (Pudlo et al. 2015)
  • ABC Bayesian parameter inference (Raynal et al. 2018)

Libraries we use:

  • Ranger (Wright and Ziegler 2015): we use our own fork, tuned to compute forests “online”[1], growing each tree and making its predictions in the same pass so that the whole forest never has to be stored in memory[2]
  • Eigen3 (Guennebaud, Jacob, and others 2010)

Note that we use our own implementations of LDA and PLS, following (Friedman, Hastie, and Tibshirani 2001, 1:81, 114).

There is one set of binaries, containing a macOS/Linux/Windows (x64 only) binary for each platform. They are available in the “Releases” tab, under the “Assets” section (unfold it to see the list).

These are pure command-line binaries, with no prerequisites or library dependencies needed to run them. Just download them and launch them from your terminal of choice. The usual caveats with command-line executables apply: if you’re not proficient with the command-line interface of your platform, please learn some basics or ask someone who can help you.

The standalone binary is part of DIYABC-RF, a specialized graphical interface for population genetics, whose companion paper is currently under review at MER (Molecular Ecology Resources) (Collin, Durif, et al. 2020).

Python

Installation

pip install pyabcranger

Notebook examples

Usage

 - ABC Random Forest - Model choice or parameter estimation command line options
Usage:
  ../build/abcranger [OPTION...]

  -h, --header arg        Header file (default: headerRF.txt)
  -r, --reftable arg      Reftable file (default: reftableRF.bin)
  -b, --statobs arg       Statobs file (default: statobsRF.txt)
  -o, --output arg        Prefix output (modelchoice_out or estimparam_out by
                          default)
  -n, --nref arg          Number of samples, 0 means all (default: 0)
  -m, --minnodesize arg   Minimal node size. 0 means 1 for classification or
                          5 for regression (default: 0)
  -t, --ntree arg         Number of trees (default: 500)
  -j, --threads arg       Number of threads, 0 means all (default: 0)
  -s, --seed arg          Seed, generated by default (default: 0)
  -c, --noisecolumns arg  Number of noise columns (default: 5)
      --nolinear          Disable LDA for model choice or PLS for parameter
                          estimation
      --plsmaxvar arg     Percentage of maximum explained Y-variance for
                          retaining pls axis (default: 0.9)
      --chosenscen arg    Chosen scenario (mandatory for parameter
                          estimation)
      --noob arg          number of oob testing samples (mandatory for
                          parameter estimation)
      --parameter arg     name of the parameter of interest (mandatory for
                          parameter estimation)
  -g, --groups arg        Groups of models
      --help              Print help
  • If you provide --chosenscen, --parameter and --noob, parameter estimation mode is selected.
  • Otherwise, model choice mode is selected by default.
  • The linear additions are LDA for model choice and PLS for parameter estimation; the --nolinear option disables them in both cases.
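
Since the same options drive the standalone binary, one quick way to experiment is to call it from a Python script. A minimal sketch, using only the documented flags (the binary path and working directory are assumptions to adapt to your setup):

import subprocess

# Model choice run with the documented defaults: headerRF.txt,
# reftableRF.bin and statobsRF.txt are read from the current directory.
result = subprocess.run(
    ["./abcranger", "-t", "500", "-j", "0"],  # 500 trees, all threads
    capture_output=True, text=True, check=True,
)
print(result.stdout)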

Model Choice

[Demo: model choice run in a terminal]

Example

Example:

abcranger -t 10000 -j 8

Header, reftable and statobs files should be in the current directory.
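
They can also be passed explicitly with the documented flags (shown here with the default file names):

abcranger -h headerRF.txt -r reftableRF.bin -b statobsRF.txt -t 10000 -j 8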

Groups

With the option -g (or --groups), you may “group” your models into several groups. For example, if you have six models, labeled from 1 to 6: -g "1,2,3;4,5,6" (groups are separated by semicolons, models within a group by commas).
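
A full invocation could then look like:

abcranger -t 10000 -j 8 -g "1,2,3;4,5,6"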

Generated files

Four files are created:

  • modelchoice_out.ooberror : OOB error rate vs. number of trees (the line number gives the number of trees; see the plotting sketch below)
  • modelchoice_out.importance : variable importances (sorted)
  • modelchoice_out.predictions : votes, prediction and posterior error rate
  • modelchoice_out.confusion : OOB confusion matrix of the classifier
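
As a quick sanity check, one can plot the OOB error curve. A minimal sketch, assuming one error value per line as described above:

import matplotlib.pyplot as plt

# Line i of the file holds the OOB error rate of a forest with i trees
with open("modelchoice_out.ooberror") as f:
    errors = [float(line) for line in f]

plt.plot(range(1, len(errors) + 1), errors)
plt.xlabel("number of trees")
plt.ylabel("OOB error rate")
plt.show()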

Parameter Estimation

[Demo: parameter estimation run in a terminal]

Composite parameters

When specifying the parameter (option --parameter), one may specify simple composite parameters formed by the division, addition or multiplication of two existing parameters, like t/N or T1+T2.
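
For instance (assuming t and N are parameters defined in your header file):

abcranger -t 1000 -j 8 --chosenscen 1 --noob 50 --parameter t/N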

A note about the PLS heuristic

The --plsmaxvar option (defaulting to 0.90) fixes the number of selected PLS axes so that we get at least the specified percentage of the maximum explained variance of the output. The explained variance of the output for the m first axes is defined via the R-squared of the output:

Yvar^m = \frac{\sum_{i=1}^{N}{(\hat{y}^{m}_{i}-\bar{y})^2}}{\sum_{i=1}^{N}{(y_{i}-\bar{y})^2}}

where \hat{y}^{m} is the output Y scored by the PLS for the m-th component. Only the n_{comp} first axes are kept, with:

n_{comp} = \underset{m}{\operatorname{argmax}} \left( Yvar^{m} \leq 0.90 \cdot Yvar^{M} \right)

Note that if you specify 0 as --plsmaxvar, an “elbow” heuristic is activated, where the following condition is tested for every computed axis:

\frac{Yvar^{k+1}+Yvar^{k}}{2} \geq 0.99(N-k)\left(Yvar^{k+1}-Yvar^{k}\right)

If this condition holds over a window of previous axes, sized at 10% of the total number of possible axes, then the PLS axis computation stops.

In practice, we find this n_{heur} close to the n_{comp} obtained with a 99% threshold, but this isn’t guaranteed.
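
A minimal Python sketch of these two selection rules (an illustration of the heuristics above, not the abcranger source; yvar is assumed to hold the cumulative explained variance Yvar^m of each computed axis):

import numpy as np

def n_components(yvar, plsmaxvar=0.9):
    """Number of PLS axes to keep, from per-axis explained variances."""
    yvar = np.asarray(yvar, dtype=float)
    N = len(yvar)
    if plsmaxvar > 0:
        # Yvar is cumulative, so the last value is the maximum Yvar^M;
        # keep every axis whose explained variance is <= plsmaxvar * Yvar^M
        below = np.nonzero(yvar <= plsmaxvar * yvar[-1])[0]
        return int(below[-1]) + 1 if below.size else 1
    # plsmaxvar == 0: "elbow" heuristic; stop once the flatness test
    # holds over a window sized at 10% of the total possible axes
    window, hits = max(1, N // 10), 0
    for k in range(N - 1):
        if (yvar[k + 1] + yvar[k]) / 2 >= 0.99 * (N - k) * (yvar[k + 1] - yvar[k]):
            hits += 1
            if hits >= window:
                return k + 1
        else:
            hits = 0
    return N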

The meaning of the noob parameter

The median global/local statistics and (global) confidence interval measures for parameter estimation need a number of OOB samples (--noob) to be reliable (typically 30% of the dataset size is sufficient). Be aware that computing the whole set (i.e. giving --noob the same value as --nref) for the weight predictions (Raynal et al. 2018) could be very costly, memory- and CPU-wise, if your dataset has a large number of samples, so it is advisable to compute them for only a subset of size noob.

Example (parameter estimation)

Example (working with the dataset in test/data):

abcranger -t 1000 -j 8 --parameter ra --chosenscen 1 --noob 50

Header, reftable and statobs files should be in the current directory.

Generated files (parameter estimation)

Five files (or seven if PLS is activated) are created:

  • estimparam_out.ooberror : OOB MSE rate vs. number of trees (the line number gives the number of trees)
  • estimparam_out.importance : variable importances (sorted)
  • estimparam_out.predictions : expectation, variance and 0.05, 0.5, 0.95 quantiles of the prediction
  • estimparam_out.predweights : csv of the value/weight pairs of the prediction, for density plots (see the sketch below)
  • estimparam_out.oobstats : various statistics on OOB (MSE, NMSE, NMAE, etc.)

If PLS is enabled:

  • estimparam_out.plsvar : variance explained by number of components
  • estimparam_out.plsweights : variable weight in the first component (sorted by absolute value)
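
A minimal sketch of such a density plot (the headerless two-column value/weight layout is an assumption; check your own predweights file):

import pandas as pd
import matplotlib.pyplot as plt

# Assumed layout: one value/weight pair per row
df = pd.read_csv("estimparam_out.predweights", header=None, names=["value", "weight"])

# weighted histogram as a simple posterior density estimate
plt.hist(df["value"], bins=100, weights=df["weight"], density=True)
plt.xlabel("parameter value")
plt.ylabel("posterior density")
plt.show()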

TODO

Input/Output

  • Integrate hdf5 (or exdir? msgpack?) routines to save/load reftables/observed stats with associated metadata
  • Provide R code to save/load the data
  • Provide Python code to save/load the data

C++ standalone

  • Merge the two methodologies in a single executable with (almost) the same options
  • (Optional) Possibly move to another options parser (CLI?)

External interfaces

  • R package
  • Python package

Documentation

  • Code documentation
  • Document the build

Continuous integration

  • Fix travis build. Currently the vcpkg download of eigen3 head is broken.
  • osX travis build
  • Appveyor win32 build

Long/Mid term TODO

  • auto-tuning of methodology parameters
    • auto-discovering the optimal number of trees by monitoring OOB error
    • auto-limiting the number of threads by available memory
  • Streamline the two methodologies (model choice and then parameter estimation)
  • Write our own tree/RF implementation with better storage efficiency than ranger
  • Make functional tests for the two methodologies
  • Possibly use Mondrian forests for online batches? See (Lakshminarayanan, Roy, and Teh 2014)

References

This work was the subject of a proceedings paper at JOBIM 2020, with PDF and video available (in French) (Collin, Estoup, et al. 2020).

Collin, François-David, Ghislain Durif, Louis Raynal, Eric Lombaert, Mathieu Gautier, Renaud Vitalis, Jean Michel Marin, and Arnaud Estoup. 2020. “Extending Approximate Bayesian Computation with Supervised Machine Learning to Infer Demographic History from Genetic Polymorphisms Using DIYABC Random Forest,” July. https://doi.org/10.22541/au.159480722.26357192.

Collin, François-David, Arnaud Estoup, Jean-Michel Marin, and Louis Raynal. 2020. “Bringing ABC Inference to the Machine Learning Realm: AbcRanger, an Optimized Random Forests Library for ABC.” In JOBIM 2020, 2020:66. Montpellier, France. https://hal.archives-ouvertes.fr/hal-02910067.

Friedman, Jerome, Trevor Hastie, and Robert Tibshirani. 2001. The Elements of Statistical Learning. Vol. 1. Springer Series in Statistics. New York, NY, USA: Springer.

Guennebaud, Gaël, Benoît Jacob, and others. 2010. “Eigen V3.” http://eigen.tuxfamily.org.

Lakshminarayanan, Balaji, Daniel M Roy, and Yee Whye Teh. 2014. “Mondrian Forests: Efficient Online Random Forests.” In Advances in Neural Information Processing Systems, 3140–48.

Pudlo, Pierre, Jean-Michel Marin, Arnaud Estoup, Jean-Marie Cornuet, Mathieu Gautier, and Christian P Robert. 2015. “Reliable ABC Model Choice via Random Forests.” Bioinformatics 32 (6): 859–66.

Raynal, Louis, Jean-Michel Marin, Pierre Pudlo, Mathieu Ribatet, Christian P Robert, and Arnaud Estoup. 2018. “ABC random forests for Bayesian parameter inference.” Bioinformatics 35 (10): 1720–28. https://doi.org/10.1093/bioinformatics/bty867.

Wright, Marvin N, and Andreas Ziegler. 2015. “Ranger: A Fast Implementation of Random Forests for High Dimensional Data in C++ and R.” arXiv Preprint arXiv:1508.04409.

[1] The term “online” here and in the code does not have the usual meaning coined in “online machine learning”: we still need the entire training data set at once. Our implementation is “online” not in the sequential order of the input data, but in the sequential order of computation of the trees in the random forests, which are computed sequentially and then discarded.

[2] We only use the C++ Core of ranger, which is under MIT License, same as ours.
