
This package contains several methods for calculating Conditional Average Treatment Effects

Project description


EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation

EconML is a Python package for estimating heterogeneous treatment effects from observational data via machine learning. This package was designed and built as part of the ALICE project at Microsoft Research with the goal of combining state-of-the-art machine learning techniques with econometrics to bring automation to complex causal inference problems. The promise of EconML:

  • Implement recent techniques in the literature at the intersection of econometrics and machine learning
  • Maintain flexibility in modeling the effect heterogeneity (via techniques such as random forests, boosting, lasso and neural nets), while preserving the causal interpretation of the learned model and often offering valid confidence intervals
  • Use a unified API
  • Build on standard Python packages for Machine Learning and Data Analysis

One of the biggest promises of machine learning is to automate decision making in a multitude of domains. At the core of many data-driven personalized decision scenarios is the estimation of heterogeneous treatment effects: what is the causal effect of an intervention on an outcome of interest for a sample with a particular set of features? In a nutshell, this toolkit is designed to measure the causal effect of some treatment variable(s) T on an outcome variable Y, controlling for a set of features X, W, and to estimate how that effect varies as a function of X. The methods implemented are applicable even with observational (non-experimental or historical) datasets. For the estimation results to have a causal interpretation, some methods assume no unobserved confounders (i.e. there is no unobserved variable not included in X, W that simultaneously has an effect on both T and Y), while others assume access to an instrument Z (i.e. an observed variable Z that has an effect on the treatment T but no direct effect on the outcome Y). Most methods provide confidence intervals and inference results.
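As a minimal sketch of this workflow, the snippet below uses simulated placeholder data and the LinearDML estimator purely for illustration; the data-generating process and hyperparameters are hypothetical, not a recommendation:

import numpy as np
from econml.dml import LinearDML

# Hypothetical simulated data: outcome Y, binary treatment T,
# effect modifiers X and additional controls W
n = 1000
X = np.random.normal(size=(n, 2))
W = np.random.normal(size=(n, 3))
T = np.random.binomial(1, 0.5, size=n)
Y = T * X[:, 0] + W[:, 0] + np.random.normal(size=n)

est = LinearDML(discrete_treatment=True)
est.fit(Y, T, X=X, W=W)                      # fit the CATE model
treatment_effects = est.effect(X)            # per-sample effect estimates
lb, ub = est.effect_interval(X, alpha=0.05)  # confidence intervals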

For detailed information about the package, consult the documentation at https://econml.azurewebsites.net/.

For information on use cases and background material on causal inference and heterogeneous treatment effects see our webpage at https://www.microsoft.com/en-us/research/project/econml/


News

July 9, 2021: Release v0.12.0b4, see release notes here

Previous releases

June 25, 2021: Release v0.12.0b3, see release notes here

June 18, 2021: Release v0.12.0b2, see release notes here

June 7, 2021: Release v0.12.0b1, see release notes here

May 18, 2021: Release v0.11.1, see release notes here

May 8, 2021: Release v0.11.0, see release notes here

March 22, 2021: Release v0.10.0, see release notes here

March 11, 2021: Release v0.9.2, see release notes here

March 3, 2021: Release v0.9.1, see release notes here

February 20, 2021: Release v0.9.0, see release notes here

January 20, 2021: Release v0.9.0b1, see release notes here

November 20, 2020: Release v0.8.1, see release notes here

November 18, 2020: Release v0.8.0, see release notes here

September 4, 2020: Release v0.8.0b1, see release notes here

March 6, 2020: Release v0.7.0, see release notes here

February 18, 2020: Release v0.7.0b1, see release notes here

January 10, 2020: Release v0.6.1, see release notes here

December 6, 2019: Release v0.6, see release notes here

November 21, 2019: Release v0.5, see release notes here.

June 3, 2019: Release v0.4, see release notes here.

May 3, 2019: Release v0.3, see release notes here.

April 10, 2019: Release v0.2, see release notes here.

March 6, 2019: Release v0.1, we welcome you to try it out and provide feedback.

Getting Started

Installation

Install the latest release from PyPI:

pip install econml

To install from source, see For Developers section below.

Usage Examples

Estimation Methods

Double Machine Learning (aka RLearner) (click to expand)
  • Linear final stage
from econml.dml import LinearDML
from sklearn.linear_model import LassoCV
from econml.inference import BootstrapInference

est = LinearDML(model_y=LassoCV(), model_t=LassoCV())
### Estimate with OLS confidence intervals
est.fit(Y, T, X=X, W=W) # W -> high-dimensional confounders, X -> features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals

### Estimate with bootstrap confidence intervals
est.fit(Y, T, X=X, W=W, inference='bootstrap')  # with default bootstrap parameters
est.fit(Y, T, X=X, W=W, inference=BootstrapInference(n_bootstrap_samples=100))  # or customized
lb, ub = est.effect_interval(X_test, alpha=0.05) # Bootstrap confidence intervals
  • Sparse linear final stage
from econml.dml import SparseLinearDML
from sklearn.linear_model import LassoCV

est = SparseLinearDML(model_y=LassoCV(), model_t=LassoCV())
est.fit(Y, T, X=X, W=W) # X -> high dimensional features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # Confidence intervals via debiased lasso
  • Generic Machine Learning last stage
from econml.dml import NonParamDML
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

est = NonParamDML(model_y=RandomForestRegressor(),
                  model_t=RandomForestClassifier(),
                  model_final=RandomForestRegressor(),
                  discrete_treatment=True)
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
Causal Forests (click to expand)
from econml.dml import CausalForestDML
from sklearn.linear_model import LassoCV
# Use defaults
est = CausalForestDML()
# Or specify hyperparameters
est = CausalForestDML(criterion='het', n_estimators=500,       
                      min_samples_leaf=10, 
                      max_depth=10, max_samples=0.5,
                      discrete_treatment=False,
                      model_t=LassoCV(), model_y=LassoCV())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Random Forests (click to expand)
from econml.orf import DMLOrthoForest, DROrthoForest
from econml.sklearn_extensions.linear_model import WeightedLasso, WeightedLassoCV
# Use defaults
est = DMLOrthoForest()
est = DROrthoForest()
# Or specify hyperparameters
est = DMLOrthoForest(n_trees=500, min_leaf_size=10,
                     max_depth=10, subsample_ratio=0.7,
                     lambda_reg=0.01,
                     discrete_treatment=False,
                     model_T=WeightedLasso(alpha=0.01), model_Y=WeightedLasso(alpha=0.01),
                     model_T_final=WeightedLassoCV(cv=3), model_Y_final=WeightedLassoCV(cv=3))
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Meta-Learners (click to expand)
  • XLearner
from econml.metalearners import XLearner
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
import numpy as np

est = XLearner(models=GradientBoostingRegressor(),
               propensity_model=GradientBoostingClassifier(),
               cate_models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))

# Fit with bootstrap confidence interval construction enabled
est.fit(Y, T, X=np.hstack([X, W]), inference='bootstrap')
treatment_effects = est.effect(np.hstack([X_test, W_test]))
lb, ub = est.effect_interval(np.hstack([X_test, W_test]), alpha=0.05) # Bootstrap CIs
  • SLearner
from econml.metalearners import SLearner
from sklearn.ensemble import GradientBoostingRegressor

est = SLearner(overall_model=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
  • TLearner
from econml.metalearners import TLearner
from sklearn.ensemble import GradientBoostingRegressor

est = TLearner(models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
Doubly Robust Learners (click to expand)
  • Linear final stage
from econml.dr import LinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = LinearDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Sparse linear final stage
from econml.dr import SparseLinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = SparseLinearDRLearner(model_propensity=GradientBoostingClassifier(),
                            model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Nonparametric final stage
from econml.dr import ForestDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = ForestDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Instrumental Variables (click to expand)
  • Intent to Treat Doubly Robust Learner (discrete instrument, discrete treatment)
from econml.iv.dr import LinearIntentToTreatDRIV
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
from sklearn.linear_model import LinearRegression

est = LinearIntentToTreatDRIV(model_Y_X=GradientBoostingRegressor(),
                              model_T_XZ=GradientBoostingClassifier(),
                              flexible_model_effect=GradientBoostingRegressor())
est.fit(Y, T, Z=Z, X=X) # OLS inference by default
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals
Deep Instrumental Variables (click to expand)
import keras
from econml.iv.nnet import DeepIV

treatment_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(64, activation='relu'),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(32, activation='relu'),
                                    keras.layers.Dropout(0.17)])
response_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(64, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(32, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(1)])
est = DeepIV(n_components=10, # Number of gaussians in the mixture density network
             m=lambda z, x: treatment_model(keras.layers.concatenate([z, x])), # Treatment model
             h=lambda t, x: response_model(keras.layers.concatenate([t, x])), # Response model
             n_samples=1 # Number of samples used to estimate the response
             )
est.fit(Y, T, X=X, Z=Z) # Z -> instrumental variables
treatment_effects = est.effect(X_test)

See the References section for more details.

Interpretability

Tree Interpreter of the CATE model (click to expand)
from econml.cate_interpreter import SingleTreeCateInterpreter
import matplotlib.pyplot as plt
intrp = SingleTreeCateInterpreter(include_model_uncertainty=True, max_depth=2, min_samples_leaf=10)
# We interpret the CATE model's behavior based on the features used for heterogeneity
intrp.interpret(est, X)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()


Policy Interpreter of the CATE model (click to expand)
from econml.cate_interpreter import SingleTreePolicyInterpreter
import matplotlib.pyplot as plt
# We find a tree-based treatment policy based on the CATE model
intrp = SingleTreePolicyInterpreter(risk_level=0.05, max_depth=2, min_samples_leaf=1, min_impurity_decrease=.001)
intrp.interpret(est, X, sample_treatment_costs=0.2)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()


SHAP values for the CATE model (click to expand)
import shap
from econml.dml import CausalForestDML
est = CausalForestDML()
est.fit(Y, T, X=X, W=W)
shap_values = est.shap_values(X)
shap.summary_plot(shap_values['Y0']['T0'])

Causal Model Selection and Cross-Validation

Causal model selection with the `RScorer` (click to expand)
from econml.score import RScorer
from econml.dml import LinearDML, NonParamDML, DML
from econml.dr import DRLearner
from econml.metalearners import XLearner, DomainAdaptationLearner, SLearner
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import PolynomialFeatures
from sklearn.model_selection import train_test_split

# split data in train-validation
X_train, X_val, T_train, T_val, Y_train, Y_val = train_test_split(X, T, Y, test_size=.4)

# define list of CATE estimators to select among
reg = lambda: RandomForestRegressor(min_samples_leaf=20)
clf = lambda: RandomForestClassifier(min_samples_leaf=20)
models = [('ldml', LinearDML(model_y=reg(), model_t=clf(), discrete_treatment=True,
                             linear_first_stages=False, cv=3)),
          ('xlearner', XLearner(models=reg(), cate_models=reg(), propensity_model=clf())),
          ('dalearner', DomainAdaptationLearner(models=reg(), final_models=reg(), propensity_model=clf())),
          ('slearner', SLearner(overall_model=reg())),
          ('drlearner', DRLearner(model_propensity=clf(), model_regression=reg(),
                                  model_final=reg(), cv=3)),
          ('rlearner', NonParamDML(model_y=reg(), model_t=clf(), model_final=reg(),
                                   discrete_treatment=True, cv=3)),
          ('dml3dlasso', DML(model_y=reg(), model_t=clf(),
                             model_final=LassoCV(cv=3, fit_intercept=False),
                             discrete_treatment=True,
                             featurizer=PolynomialFeatures(degree=3),
                             linear_first_stages=False, cv=3))
]

# fit cate models on train data
models = [(name, mdl.fit(Y_train, T_train, X=X_train)) for name, mdl in models]

# score cate models on validation data
scorer = RScorer(model_y=reg(), model_t=clf(),
                 discrete_treatment=True, cv=3, mc_iters=2, mc_agg='median')
scorer.fit(Y_val, T_val, X=X_val)
rscore = [scorer.score(mdl) for _, mdl in models]
# select the best model
mdl, _ = scorer.best_model([mdl for _, mdl in models])
# create weighted ensemble model based on score performance
mdl, _ = scorer.ensemble([mdl for _, mdl in models])
First Stage Model Selection (click to expand)

First stage models can be selected either by passing cross-validated models (e.g. sklearn.linear_model.LassoCV) to EconML's estimators, or by performing the first stage model selection outside of EconML and passing in the selected model. Unless you are selecting among a large set of hyperparameters, choosing first stage models externally is the preferred approach due to statistical and computational advantages.

from econml.dml import LinearDML
from sklearn import clone
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

cv_model = GridSearchCV(
              estimator=RandomForestRegressor(),
              param_grid={
                  "max_depth": [3, None],
                  "n_estimators": (10, 30, 50, 100, 200),
                  "max_features": (2, 4, 6),
              },
              cv=5,
           )
# First stage model selection within EconML
# This is more direct, but computationally and statistically less efficient
est = LinearDML(model_y=cv_model, model_t=cv_model)
# First stage model selection outside of EconML
# This is the most efficient, but requires boilerplate code
model_t = clone(cv_model).fit(W, T).best_estimator_
model_y = clone(cv_model).fit(W, Y).best_estimator_
est = LinearDML(model_y=model_y, model_t=model_t)

Inference

Whenever inference is enabled, one can obtain a more structured InferenceResults object with more elaborate inference information, such as p-values and z-statistics. When the CATE model is linear and parametric, a summary() method is also available. For instance:

from econml.dml import LinearDML
# Use defaults
est = LinearDML()
est.fit(Y, T, X=X, W=W)
# Get the effect inference summary, which includes the standard error, z test score, p value, and confidence interval given each sample X[i]
est.effect_inference(X_test).summary_frame(alpha=0.05, value=0, decimals=3)
# Get the population summary for the entire sample X
est.effect_inference(X_test).population_summary(alpha=0.1, value=0, decimals=3, tol=0.001)
#  Get the parameter inference summary for the final model
est.summary()
Example Output (click to expand)
# Get the effect inference summary, which includes the standard error, z test score, p value, and confidence interval given each sample X[i]
est.effect_inference(X_test).summary_frame(alpha=0.05, value=0, decimals=3)


# Get the population summary for the entire sample X
est.effect_inference(X_test).population_summary(alpha=0.1, value=0, decimals=3, tol=0.001)


#  Get the parameter inference summary for the final model
est.summary()


Policy Learning

You can also perform direct policy learning from observational data, using the doubly robust method for offline policy learning. These methods directly predict a recommended treatment, without internally fitting an explicit model of the conditional average treatment effect.

Doubly Robust Policy Learning (click to expand)
from econml.policy import DRPolicyTree, DRPolicyForest
from sklearn.ensemble import RandomForestRegressor
import matplotlib.pyplot as plt

# fit a single binary decision tree policy
policy = DRPolicyTree(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the binary decision tree
plt.figure(figsize=(10,5))
policy.plot()
# get feature importances
importances = policy.feature_importances_

# fit a binary decision forest
policy = DRPolicyForest(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the first tree in the ensemble
plt.figure(figsize=(10,5))
policy.plot(0)
# get feature importances
importances = policy.feature_importances_


To see more complex examples, go to the notebooks section of the repository. For a more detailed description of the treatment effect estimation algorithms, see the EconML documentation.

For Developers

You can get started by cloning this repository. We use setuptools for building and distributing our package. We rely on some recent features of setuptools, so make sure to upgrade to a recent version with pip install setuptools --upgrade. Then from your local copy of the repository you can run pip install -e . to get started (but depending on what you're doing you might want to install with extras instead, like pip install -e .[plt] if you want to use matplotlib integration, or you can use pip install -e .[all] to include all extras).
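For example, a typical developer setup might look like the following (the repository URL is the project's GitHub page cited below; pick whichever extras fit your needs):

git clone https://github.com/microsoft/EconML.git
cd EconML
pip install setuptools --upgrade
pip install -e .[all]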

Running the tests

This project uses pytest for testing. To run tests locally after installing the package, you can use pip install pytest-runner followed by python setup.py pytest.

We have added pytest marks to some tests to make it easier to run a subset, and you can set the PYTEST_ADDOPTS environment variable to take advantage of this. For instance, you can set it to -m "not (notebook or automl)" to skip notebook and automl tests that have some additional dependencies.
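For instance, in a bash shell the full test invocation might look like this, using the example filter above:

pip install pytest-runner
export PYTEST_ADDOPTS='-m "not (notebook or automl)"'
python setup.py pytest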

Generating the documentation

This project's documentation is generated via Sphinx. Note that we use graphviz's dot application to produce some of the images in our documentation, so you should make sure that dot is installed and in your path.

To generate a local copy of the documentation from a clone of this repository, just run python setup.py build_sphinx -W -E -a, which will build the documentation and place it under the build/sphinx/html path.

The reStructuredText files that make up the documentation are stored in the docs directory; module documentation is automatically generated by the Sphinx build process.

Blogs and Publications

Citation

If you use EconML in your research, please cite us as follows:

Keith Battocchi, Eleanor Dillon, Maggie Hei, Greg Lewis, Paul Oka, Miruna Oprescu, Vasilis Syrgkanis. EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation. https://github.com/microsoft/EconML, 2019. Version 0.x.

BibTeX:

@misc{econml,
  author={Keith Battocchi, Eleanor Dillon, Maggie Hei, Greg Lewis, Paul Oka, Miruna Oprescu, Vasilis Syrgkanis},
  title={{EconML}: {A Python Package for ML-Based Heterogeneous Treatment Effects Estimation}},
  howpublished={https://github.com/microsoft/EconML},
  note={Version 0.x},
  year={2019}
}

Contributing and Feedback

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

References

Athey, Susan, and Stefan Wager. Policy learning with observational data. Econometrica 89.1 (2021): 133-161.

X. Nie, S. Wager. Quasi-Oracle Estimation of Heterogeneous Treatment Effects. Biometrika, 2020.

V. Syrgkanis, V. Lei, M. Oprescu, M. Hei, K. Battocchi, G. Lewis. Machine Learning Estimation of Heterogeneous Treatment Effects with Instruments. Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019 (Spotlight Presentation)

D. Foster, V. Syrgkanis. Orthogonal Statistical Learning. Proceedings of the 32nd Annual Conference on Learning Theory (COLT), 2019 (Best Paper Award)

M. Oprescu, V. Syrgkanis and Z. S. Wu. Orthogonal Random Forest for Causal Inference. Proceedings of the 36th International Conference on Machine Learning (ICML), 2019.

S. Künzel, J. Sekhon, J. Bickel and B. Yu. Metalearners for estimating heterogeneous treatment effects using machine learning. Proceedings of the national academy of sciences, 116(10), 4156-4165, 2019.

S. Athey, J. Tibshirani, S. Wager. Generalized random forests. Annals of Statistics, 47, no. 2, 1148--1178, 2019.

V. Chernozhukov, D. Nekipelov, V. Semenova, V. Syrgkanis. Plug-in Regularized Estimation of High-Dimensional Parameters in Nonlinear Semiparametric Models. Arxiv preprint arxiv:1806.04823, 2018.

S. Wager, S. Athey. Estimation and Inference of Heterogeneous Treatment Effects using Random Forests. Journal of the American Statistical Association, 113:523, 1228-1242, 2018.

Jason Hartford, Greg Lewis, Kevin Leyton-Brown, and Matt Taddy. Deep IV: A flexible approach for counterfactual prediction. Proceedings of the 34th International Conference on Machine Learning, ICML'17, 2017.

V. Chernozhukov, D. Chetverikov, M. Demirer, E. Duflo, C. Hansen, and W. Newey. Double Machine Learning for Treatment and Causal Parameters. ArXiv preprint arXiv:1608.00060, 2016.

Dudik, M., Erhan, D., Langford, J., & Li, L. Doubly robust policy evaluation and optimization. Statistical Science, 29(4), 485-511, 2014.
