
This package contains several methods for calculating Conditional Average Treatment Effects

Project description


EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation

EconML is a Python package for estimating heterogeneous treatment effects from observational data via machine learning. This package was designed and built as part of the ALICE project at Microsoft Research with the goal to combine state-of-the-art machine learning techniques with econometrics to bring automation to complex causal inference problems. The promise of EconML:

  • Implement recent techniques in the literature at the intersection of econometrics and machine learning
  • Maintain flexibility in modeling the effect heterogeneity (via techniques such as random forests, boosting, lasso and neural nets), while preserving the causal interpretation of the learned model and often offering valid confidence intervals
  • Use a unified API
  • Build on standard Python packages for Machine Learning and Data Analysis

One of the biggest promises of machine learning is to automate decision making in a multitude of domains. At the core of many data-driven personalized decision scenarios is the estimation of heterogeneous treatment effects: what is the causal effect of an intervention on an outcome of interest for a sample with a particular set of features? In a nutshell, this toolkit is designed to measure the causal effect of some treatment variable(s) T on an outcome variable Y, controlling for a set of features X, W, and to estimate how that effect varies as a function of X. The methods implemented are applicable even with observational (non-experimental or historical) datasets. For the estimation results to have a causal interpretation, some methods assume no unobserved confounders (i.e. there is no unobserved variable not included in X, W that simultaneously has an effect on both T and Y), while others assume access to an instrument Z (i.e. an observed variable Z that has an effect on the treatment T but no direct effect on the outcome Y). Most methods provide confidence intervals and inference results.
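As a minimal end-to-end sketch of this workflow (using hypothetical synthetic data and an illustrative default LinearDML estimator; the variable names and data-generating process below are made up for this example):

import numpy as np
from econml.dml import LinearDML

# Hypothetical synthetic data: features X drive effect heterogeneity,
# controls W confound both the treatment T and the outcome Y
np.random.seed(123)
n = 1000
X = np.random.normal(size=(n, 1))
W = np.random.normal(size=(n, 4))
T = W[:, 0] + np.random.normal(size=n)
Y = (1 + 2 * X[:, 0]) * T + W[:, 0] + np.random.normal(size=n)  # true CATE is 1 + 2*X

est = LinearDML()  # default first-stage models, as in the examples below
est.fit(Y, T, X=X, W=W)
X_test = np.linspace(-2, 2, 5).reshape(-1, 1)
print(est.effect(X_test))  # estimated CATE at each test point, roughly 1 + 2*X_test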

For detailed information about the package, consult the documentation at https://econml.azurewebsites.net/.

For information on use cases and background material on causal inference and heterogeneous treatment effects see our webpage at https://www.microsoft.com/en-us/research/project/econml/


News

March 22, 2021: Release v0.10.0, see release notes here

Previous releases

March 11, 2021: Release v0.9.2, see release notes here

March 3, 2021: Release v0.9.1, see release notes here

February 20, 2021: Release v0.9.0, see release notes here

January 20, 2021: Release v0.9.0b1, see release notes here

November 20, 2020: Release v0.8.1, see release notes here

November 18, 2020: Release v0.8.0, see release notes here

September 4, 2020: Release v0.8.0b1, see release notes here

March 6, 2020: Release v0.7.0, see release notes here

February 18, 2020: Release v0.7.0b1, see release notes here

January 10, 2020: Release v0.6.1, see release notes here

December 6, 2019: Release v0.6, see release notes here

November 21, 2019: Release v0.5, see release notes here.

June 3, 2019: Release v0.4, see release notes here.

May 3, 2019: Release v0.3, see release notes here.

April 10, 2019: Release v0.2, see release notes here.

March 6, 2019: Release v0.1; we welcome you to try it out and provide feedback.

Getting Started

Installation

Install the latest release from PyPI:

pip install econml

To install from source, see the For Developers section below.

Usage Examples

Estimation Methods

Double Machine Learning (aka RLearner)
  • Linear final stage
from econml.dml import LinearDML
from sklearn.linear_model import LassoCV
from econml.inference import BootstrapInference

est = LinearDML(model_y=LassoCV(), model_t=LassoCV())
### Estimate with OLS confidence intervals
est.fit(Y, T, X=X, W=W) # W -> high-dimensional confounders, X -> features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals

### Estimate with bootstrap confidence intervals
est.fit(Y, T, X=X, W=W, inference='bootstrap')  # with default bootstrap parameters
est.fit(Y, T, X=X, W=W, inference=BootstrapInference(n_bootstrap_samples=100))  # or customized
lb, ub = est.effect_interval(X_test, alpha=0.05) # Bootstrap confidence intervals
  • Sparse linear final stage
from econml.dml import SparseLinearDML
from sklearn.linear_model import LassoCV

est = SparseLinearDML(model_y=LassoCV(), model_t=LassoCV())
est.fit(Y, T, X=X, W=W) # X -> high dimensional features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # Confidence intervals via debiased lasso
  • Generic Machine Learning last stage
from econml.dml import NonParamDML
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

est = NonParamDML(model_y=RandomForestRegressor(),
                  model_t=RandomForestClassifier(),
                  model_final=RandomForestRegressor(),
                  discrete_treatment=True)
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
Causal Forests
from econml.dml import CausalForestDML
from sklearn.linear_model import LassoCV
# Use defaults
est = CausalForestDML()
# Or specify hyperparameters
est = CausalForestDML(criterion='het', n_estimators=500,       
                      min_samples_leaf=10, 
                      max_depth=10, max_samples=0.5,
                      discrete_treatment=False,
                      model_t=LassoCV(), model_y=LassoCV())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Random Forests
from econml.orf import DMLOrthoForest, DROrthoForest
from econml.sklearn_extensions.linear_model import WeightedLasso, WeightedLassoCV
# Use defaults
est = DMLOrthoForest()
est = DROrthoForest()
# Or specify hyperparameters
est = DMLOrthoForest(n_trees=500, min_leaf_size=10,
                     max_depth=10, subsample_ratio=0.7,
                     lambda_reg=0.01,
                     discrete_treatment=False,
                     model_T=WeightedLasso(alpha=0.01), model_Y=WeightedLasso(alpha=0.01),
                     model_T_final=WeightedLassoCV(cv=3), model_Y_final=WeightedLassoCV(cv=3))
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Meta-Learners
  • XLearner
from econml.metalearners import XLearner
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
import numpy as np

est = XLearner(models=GradientBoostingRegressor(),
               propensity_model=GradientBoostingClassifier(),
               cate_models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))

# Fit with bootstrap confidence interval construction enabled
est.fit(Y, T, X=np.hstack([X, W]), inference='bootstrap')
treatment_effects = est.effect(np.hstack([X_test, W_test]))
lb, ub = est.effect_interval(np.hstack([X_test, W_test]), alpha=0.05) # Bootstrap CIs
  • SLearner
from econml.metalearners import SLearner
from sklearn.ensemble import GradientBoostingRegressor
import numpy as np

est = SLearner(overall_model=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
  • TLearner
from econml.metalearners import TLearner
from sklearn.ensemble import GradientBoostingRegressor
import numpy as np

est = TLearner(models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
Doubly Robust Learners
  • Linear final stage
from econml.dr import LinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = LinearDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Sparse linear final stage
from econml.dr import SparseLinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = SparseLinearDRLearner(model_propensity=GradientBoostingClassifier(),
                            model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Nonparametric final stage
from econml.dr import ForestDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = ForestDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Instrumental Variables
  • Intent to Treat Doubly Robust Learner (discrete instrument, discrete treatment)
from econml.iv.dr import LinearIntentToTreatDRIV
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
from sklearn.linear_model import LinearRegression

est = LinearIntentToTreatDRIV(model_Y_X=GradientBoostingRegressor(),
                              model_T_XZ=GradientBoostingClassifier(),
                              flexible_model_effect=GradientBoostingRegressor())
est.fit(Y, T, Z=Z, X=X) # OLS inference by default
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals
Deep Instrumental Variables
import keras
from econml.iv.nnet import DeepIV

treatment_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(64, activation='relu'),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(32, activation='relu'),
                                    keras.layers.Dropout(0.17)])
response_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(64, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(32, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(1)])
est = DeepIV(n_components=10, # Number of gaussians in the mixture density network
             m=lambda z, x: treatment_model(keras.layers.concatenate([z, x])), # Treatment model
             h=lambda t, x: response_model(keras.layers.concatenate([t, x])), # Response model
             n_samples=1 # Number of samples used to estimate the response
             )
est.fit(Y, T, X=X, Z=Z) # Z -> instrumental variables
treatment_effects = est.effect(X_test)

See the References section for more details.

Interpretability

Tree Interpreter of the CATE model
from econml.cate_interpreter import SingleTreeCateInterpreter
import matplotlib.pyplot as plt

# est is a previously fit CATE estimator (e.g., any of the estimators above)
intrp = SingleTreeCateInterpreter(include_model_uncertainty=True, max_depth=2, min_samples_leaf=10)
# We interpret the CATE model's behavior based on the features used for heterogeneity
intrp.interpret(est, X)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()


Policy Interpreter of the CATE model
from econml.cate_interpreter import SingleTreePolicyInterpreter
import matplotlib.pyplot as plt

# We find a tree-based treatment policy based on the CATE model
# est is a previously fit CATE estimator (e.g., any of the estimators above)
intrp = SingleTreePolicyInterpreter(risk_level=0.05, max_depth=2, min_samples_leaf=1, min_impurity_decrease=.001)
intrp.interpret(est, X, sample_treatment_costs=0.2)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()


SHAP values for the CATE model
import shap
from econml.dml import CausalForestDML
est = CausalForestDML()
est.fit(Y, T, X=X, W=W)
shap_values = est.shap_values(X)
shap.summary_plot(shap_values['Y0']['T0'])

Causal Model Selection and Cross-Validation

Causal model selection with the `RScorer`
from econml.score import RScorer
from econml.dml import LinearDML, NonParamDML, DML
from econml.dr import DRLearner
from econml.metalearners import XLearner, DomainAdaptationLearner, SLearner
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import PolynomialFeatures
from sklearn.model_selection import train_test_split

# split data in train-validation
X_train, X_val, T_train, T_val, Y_train, Y_val = train_test_split(X, T, Y, test_size=.4)

# define list of CATE estimators to select among
reg = lambda: RandomForestRegressor(min_samples_leaf=20)
clf = lambda: RandomForestClassifier(min_samples_leaf=20)
models = [('ldml', LinearDML(model_y=reg(), model_t=clf(), discrete_treatment=True,
                             linear_first_stages=False, cv=3)),
          ('xlearner', XLearner(models=reg(), cate_models=reg(), propensity_model=clf())),
          ('dalearner', DomainAdaptationLearner(models=reg(), final_models=reg(), propensity_model=clf())),
          ('slearner', SLearner(overall_model=reg())),
          ('drlearner', DRLearner(model_propensity=clf(), model_regression=reg(),
                                  model_final=reg(), cv=3)),
          ('rlearner', NonParamDML(model_y=reg(), model_t=clf(), model_final=reg(),
                                   discrete_treatment=True, cv=3)),
          ('dml3dlasso', DML(model_y=reg(), model_t=clf(),
                             model_final=LassoCV(cv=3, fit_intercept=False),
                             discrete_treatment=True,
                             featurizer=PolynomialFeatures(degree=3),
                             linear_first_stages=False, cv=3))
]

# fit cate models on train data
models = [(name, mdl.fit(Y_train, T_train, X=X_train)) for name, mdl in models]

# score cate models on validation data
scorer = RScorer(model_y=reg(), model_t=clf(),
                 discrete_treatment=True, cv=3, mc_iters=2, mc_agg='median')
scorer.fit(Y_val, T_val, X=X_val)
rscore = [scorer.score(mdl) for _, mdl in models]
# select the best model
mdl, _ = scorer.best_model([mdl for _, mdl in models])
# create weighted ensemble model based on score performance
mdl, _ = scorer.ensemble([mdl for _, mdl in models])
First Stage Model Selection

First stage models can be selected either by passing in cross-validated models (e.g. sklearn.linear_model.LassoCV) to EconML's estimators, or by performing the first stage model selection outside of EconML and passing in the selected model. Unless you are selecting among a large set of hyperparameters, choosing first stage models externally is the preferred method due to statistical and computational advantages.

from econml.dml import LinearDML
from sklearn import clone
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

cv_model = GridSearchCV(
              estimator=RandomForestRegressor(),
              param_grid={
                  "max_depth": [3, None],
                  "n_estimators": (10, 30, 50, 100, 200),
                  "max_features": (2, 4, 6),
              },
              cv=5,
           )
# First stage model selection within EconML
# This is more direct, but computationally and statistically less efficient
est = LinearDML(model_y=cv_model, model_t=cv_model)
# First stage model selection outside of EconML
# This is the most efficient, but requires boilerplate code
model_t = clone(cv_model).fit(W, T).best_estimator_
model_y = clone(cv_model).fit(W, Y).best_estimator_
est = LinearDML(model_y=model_y, model_t=model_t)

Inference

Whenever inference is enabled, one can get a more structured InferenceResults object with more elaborate inference information, such as p-values and z-statistics. When the CATE model is linear and parametric, a summary() method is also enabled. For instance:

from econml.dml import LinearDML
# Use defaults
est = LinearDML()
est.fit(Y, T, X=X, W=W)
# Get the effect inference summary, which includes the standard error, z test score, p value, and confidence interval given each sample X[i]
est.effect_inference(X_test).summary_frame(alpha=0.05, value=0, decimals=3)
# Get the population summary for the entire sample X
est.effect_inference(X_test).population_summary(alpha=0.1, value=0, decimals=3, tol=0.001)
#  Get the parameter inference summary for the final model
est.summary()

To see more complex examples, go to the notebooks section of the repository. For a more detailed description of the treatment effect estimation algorithms, see the EconML documentation.

For Developers

You can get started by cloning this repository. We use setuptools for building and distributing our package. We rely on some recent features of setuptools, so make sure to upgrade to a recent version with pip install setuptools --upgrade. Then, from your local copy of the repository, you can run python setup.py develop to install the package in development mode.
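For example, a typical setup sequence looks like the following (a sketch of the steps described above; the repository URL is the one listed in the Citation section):

git clone https://github.com/microsoft/EconML.git
cd EconML
pip install setuptools --upgrade
python setup.py develop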

Running the tests

This project uses pytest for testing. To run tests locally after installing the package, you can use python setup.py pytest.

Generating the documentation

This project's documentation is generated via Sphinx. Note that we use graphviz's dot application to produce some of the images in our documentation, so you should make sure that dot is installed and in your path.

To generate a local copy of the documentation from a clone of this repository, just run python setup.py build_sphinx -W -E -a, which will build the documentation and place it under the build/sphinx/html path.

The reStructuredText files that make up the documentation are stored in the docs directory; module documentation is automatically generated by the Sphinx build process.

Blogs and Publications

Citation

If you use EconML in your research, please cite us as follows:

Microsoft Research. EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation. https://github.com/microsoft/EconML, 2019. Version 0.x.

BibTex:

@misc{econml,
  author={Microsoft Research},
  title={{EconML}: {A Python Package for ML-Based Heterogeneous Treatment Effects Estimation}},
  howpublished={https://github.com/microsoft/EconML},
  note={Version 0.x},
  year={2019}
}

Contributing and Feedback

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

References

X. Nie, S. Wager. Quasi-Oracle Estimation of Heterogeneous Treatment Effects. Biometrika, 2020.

V. Syrgkanis, V. Lei, M. Oprescu, M. Hei, K. Battocchi, G. Lewis. Machine Learning Estimation of Heterogeneous Treatment Effects with Instruments. Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019 (Spotlight Presentation)

D. Foster, V. Syrgkanis. Orthogonal Statistical Learning. Proceedings of the 32nd Annual Conference on Learning Theory (COLT), 2019 (Best Paper Award)

M. Oprescu, V. Syrgkanis and Z. S. Wu. Orthogonal Random Forest for Causal Inference. Proceedings of the 36th International Conference on Machine Learning (ICML), 2019.

S. Künzel, J. Sekhon, P. Bickel and B. Yu. Metalearners for estimating heterogeneous treatment effects using machine learning. Proceedings of the National Academy of Sciences, 116(10), 4156-4165, 2019.

S. Athey, J. Tibshirani, S. Wager. Generalized random forests. Annals of Statistics, 47, no. 2, 1148--1178, 2019.

V. Chernozhukov, D. Nekipelov, V. Semenova, V. Syrgkanis. Plug-in Regularized Estimation of High-Dimensional Parameters in Nonlinear Semiparametric Models. arXiv preprint arXiv:1806.04823, 2018.

S. Wager, S. Athey. Estimation and Inference of Heterogeneous Treatment Effects using Random Forests. Journal of the American Statistical Association, 113:523, 1228-1242, 2018.

Jason Hartford, Greg Lewis, Kevin Leyton-Brown, and Matt Taddy. Deep IV: A flexible approach for counterfactual prediction. Proceedings of the 34th International Conference on Machine Learning, ICML'17, 2017.

V. Chernozhukov, D. Chetverikov, M. Demirer, E. Duflo, C. Hansen, and W. Newey. Double Machine Learning for Treatment and Causal Parameters. arXiv preprint arXiv:1608.00060, 2016.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • econml-0.10.0.tar.gz (1.3 MB): Source

Built Distributions

  • econml-0.10.0-cp38-cp38-win_amd64.whl (887.3 kB): CPython 3.8, Windows x86-64
  • econml-0.10.0-cp38-cp38-win32.whl (787.0 kB): CPython 3.8, Windows x86
  • econml-0.10.0-cp38-cp38-manylinux2010_x86_64.whl (3.3 MB): CPython 3.8, manylinux: glibc 2.12+ x86-64
  • econml-0.10.0-cp38-cp38-manylinux2010_i686.whl (3.1 MB): CPython 3.8, manylinux: glibc 2.12+ i686
  • econml-0.10.0-cp38-cp38-manylinux1_x86_64.whl (3.3 MB): CPython 3.8
  • econml-0.10.0-cp38-cp38-manylinux1_i686.whl (3.1 MB): CPython 3.8
  • econml-0.10.0-cp38-cp38-macosx_10_9_x86_64.whl (895.7 kB): CPython 3.8, macOS 10.9+ x86-64
  • econml-0.10.0-cp37-cp37m-win_amd64.whl (878.1 kB): CPython 3.7m, Windows x86-64
  • econml-0.10.0-cp37-cp37m-win32.whl (777.9 kB): CPython 3.7m, Windows x86
  • econml-0.10.0-cp37-cp37m-manylinux2010_x86_64.whl (3.0 MB): CPython 3.7m, manylinux: glibc 2.12+ x86-64
  • econml-0.10.0-cp37-cp37m-manylinux2010_i686.whl (2.8 MB): CPython 3.7m, manylinux: glibc 2.12+ i686
  • econml-0.10.0-cp37-cp37m-manylinux1_x86_64.whl (3.0 MB): CPython 3.7m
  • econml-0.10.0-cp37-cp37m-manylinux1_i686.whl (2.8 MB): CPython 3.7m
  • econml-0.10.0-cp37-cp37m-macosx_10_9_x86_64.whl (897.2 kB): CPython 3.7m, macOS 10.9+ x86-64
  • econml-0.10.0-cp36-cp36m-win_amd64.whl (877.5 kB): CPython 3.6m, Windows x86-64
  • econml-0.10.0-cp36-cp36m-win32.whl (777.7 kB): CPython 3.6m, Windows x86
  • econml-0.10.0-cp36-cp36m-manylinux2010_x86_64.whl (3.0 MB): CPython 3.6m, manylinux: glibc 2.12+ x86-64
  • econml-0.10.0-cp36-cp36m-manylinux2010_i686.whl (2.8 MB): CPython 3.6m, manylinux: glibc 2.12+ i686
  • econml-0.10.0-cp36-cp36m-manylinux1_x86_64.whl (3.0 MB): CPython 3.6m
  • econml-0.10.0-cp36-cp36m-manylinux1_i686.whl (2.8 MB): CPython 3.6m
  • econml-0.10.0-cp36-cp36m-macosx_10_9_x86_64.whl (900.5 kB): CPython 3.6m, macOS 10.9+ x86-64

