
This package contains several methods for estimating Conditional Average Treatment Effects (CATE).



EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation

EconML is a Python package for estimating heterogeneous treatment effects from observational data via machine learning. This package was designed and built as part of the ALICE project at Microsoft Research with the goal of combining state-of-the-art machine learning techniques with econometrics to bring automation to complex causal inference problems. The promise of EconML:

  • Implement recent techniques in the literature at the intersection of econometrics and machine learning
  • Maintain flexibility in modeling the effect heterogeneity (via techniques such as random forests, boosting, lasso and neural nets), while preserving the causal interpretation of the learned model and often offering valid confidence intervals
  • Use a unified API
  • Build on standard Python packages for Machine Learning and Data Analysis

One of the biggest promises of machine learning is to automate decision making in a multitude of domains. At the core of many data-driven personalized decision scenarios is the estimation of heterogeneous treatment effects: what is the causal effect of an intervention on an outcome of interest for a sample with a particular set of features? In a nutshell, this toolkit is designed to measure the causal effect of some treatment variable(s) T on an outcome variable Y, controlling for a set of features X, W, and to estimate how that effect varies as a function of X. The methods implemented are applicable even with observational (non-experimental or historical) datasets. For the estimation results to have a causal interpretation, some methods assume no unobserved confounders (i.e., there is no unobserved variable not included in X, W that simultaneously affects both T and Y), while others assume access to an instrument Z (i.e., an observed variable Z that affects the treatment T but has no direct effect on the outcome Y). Most methods provide confidence intervals and inference results.
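To make this concrete, here is a minimal, self-contained sketch (not taken from the package documentation; the simulated data-generating process and the choice of LinearDML with default first-stage models are illustrative assumptions). It simulates a dataset where the treatment is confounded by W and the effect of T on Y varies with X, then recovers that heterogeneous effect:

import numpy as np
from econml.dml import LinearDML

rng = np.random.default_rng(0)
n = 2000
X = rng.uniform(0, 1, size=(n, 1))        # effect-modifying features
W = rng.normal(size=(n, 3))               # confounders / controls
T = 0.5 * W[:, 0] + rng.normal(size=n)    # treatment depends on the confounder W
Y = (1 + 2 * X[:, 0]) * T + W[:, 0] + rng.normal(size=n)  # true effect of T is 1 + 2*X

est = LinearDML()
est.fit(Y, T, X=X, W=W)
X_test = np.linspace(0, 1, 50).reshape(-1, 1)
treatment_effects = est.effect(X_test)    # should roughly track 1 + 2*X_test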

For detailed information about the package, consult the documentation at https://econml.azurewebsites.net/.

For information on use cases and background material on causal inference and heterogeneous treatment effects, see our webpage at https://www.microsoft.com/en-us/research/project/econml/.


News

May 18, 2021: Release v0.11.1, see release notes here

Previous releases

May 8, 2021: Release v0.11.0, see release notes here

March 22, 2021: Release v0.10.0, see release notes here

March 11, 2021: Release v0.9.2, see release notes here

March 3, 2021: Release v0.9.1, see release notes here

February 20, 2021: Release v0.9.0, see release notes here

January 20, 2021: Release v0.9.0b1, see release notes here

November 20, 2020: Release v0.8.1, see release notes here

November 18, 2020: Release v0.8.0, see release notes here

September 4, 2020: Release v0.8.0b1, see release notes here

March 6, 2020: Release v0.7.0, see release notes here

February 18, 2020: Release v0.7.0b1, see release notes here

January 10, 2020: Release v0.6.1, see release notes here

December 6, 2019: Release v0.6, see release notes here

November 21, 2019: Release v0.5, see release notes here.

June 3, 2019: Release v0.4, see release notes here.

May 3, 2019: Release v0.3, see release notes here.

April 10, 2019: Release v0.2, see release notes here.

March 6, 2019: Release v0.1; we welcome you to try it out and provide feedback.

Getting Started

Installation

Install the latest release from PyPI:

pip install econml

To install from source, see the For Developers section below.

Usage Examples

Estimation Methods

Double Machine Learning (aka RLearner)
  • Linear final stage
from econml.dml import LinearDML
from sklearn.linear_model import LassoCV
from econml.inference import BootstrapInference

est = LinearDML(model_y=LassoCV(), model_t=LassoCV())
### Estimate with OLS confidence intervals
est.fit(Y, T, X=X, W=W) # W -> high-dimensional confounders, X -> features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals

### Estimate with bootstrap confidence intervals
est.fit(Y, T, X=X, W=W, inference='bootstrap')  # with default bootstrap parameters
est.fit(Y, T, X=X, W=W, inference=BootstrapInference(n_bootstrap_samples=100))  # or customized
lb, ub = est.effect_interval(X_test, alpha=0.05) # Bootstrap confidence intervals
  • Sparse linear final stage
from econml.dml import SparseLinearDML
from sklearn.linear_model import LassoCV

est = SparseLinearDML(model_y=LassoCV(), model_t=LassoCV())
est.fit(Y, T, X=X, W=W) # X -> high dimensional features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # Confidence intervals via debiased lasso
  • Generic Machine Learning last stage
from econml.dml import NonParamDML
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

est = NonParamDML(model_y=RandomForestRegressor(),
                  model_t=RandomForestClassifier(),
                  model_final=RandomForestRegressor(),
                  discrete_treatment=True)
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
Causal Forests
from econml.dml import CausalForestDML
from sklearn.linear_model import LassoCV
# Use defaults
est = CausalForestDML()
# Or specify hyperparameters
est = CausalForestDML(criterion='het', n_estimators=500,       
                      min_samples_leaf=10, 
                      max_depth=10, max_samples=0.5,
                      discrete_treatment=False,
                      model_t=LassoCV(), model_y=LassoCV())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Random Forests
from econml.orf import DMLOrthoForest, DROrthoForest
from econml.sklearn_extensions.linear_model import WeightedLasso, WeightedLassoCV
# Use defaults
est = DMLOrthoForest()
est = DROrthoForest()
# Or specify hyperparameters
est = DMLOrthoForest(n_trees=500, min_leaf_size=10,
                     max_depth=10, subsample_ratio=0.7,
                     lambda_reg=0.01,
                     discrete_treatment=False,
                     model_T=WeightedLasso(alpha=0.01), model_Y=WeightedLasso(alpha=0.01),
                     model_T_final=WeightedLassoCV(cv=3), model_Y_final=WeightedLassoCV(cv=3))
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Meta-Learners
  • XLearner
import numpy as np
from econml.metalearners import XLearner
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor

est = XLearner(models=GradientBoostingRegressor(),
               propensity_model=GradientBoostingClassifier(),
               cate_models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))

# Fit with bootstrap confidence interval construction enabled
est.fit(Y, T, X=np.hstack([X, W]), inference='bootstrap')
treatment_effects = est.effect(np.hstack([X_test, W_test]))
lb, ub = est.effect_interval(np.hstack([X_test, W_test]), alpha=0.05) # Bootstrap CIs
  • SLearner
import numpy as np
from econml.metalearners import SLearner
from sklearn.ensemble import GradientBoostingRegressor

est = SLearner(overall_model=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
  • TLearner
import numpy as np
from econml.metalearners import TLearner
from sklearn.ensemble import GradientBoostingRegressor

est = TLearner(models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
Doubly Robust Learners
  • Linear final stage
from econml.dr import LinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = LinearDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Sparse linear final stage
from econml.dr import SparseLinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = SparseLinearDRLearner(model_propensity=GradientBoostingClassifier(),
                            model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Nonparametric final stage
from econml.dr import ForestDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = ForestDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Instrumental Variables
  • Intent to Treat Doubly Robust Learner (discrete instrument, discrete treatment)
from econml.iv.dr import LinearIntentToTreatDRIV
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = LinearIntentToTreatDRIV(model_Y_X=GradientBoostingRegressor(),
                              model_T_XZ=GradientBoostingClassifier(),
                              flexible_model_effect=GradientBoostingRegressor())
est.fit(Y, T, Z=Z, X=X) # OLS inference by default
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals
Deep Instrumental Variables
import keras
from econml.iv.nnet import DeepIV

treatment_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(64, activation='relu'),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(32, activation='relu'),
                                    keras.layers.Dropout(0.17)])
response_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(64, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(32, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(1)])
est = DeepIV(n_components=10, # Number of Gaussians in the mixture density network
             m=lambda z, x: treatment_model(keras.layers.concatenate([z, x])), # Treatment model
             h=lambda t, x: response_model(keras.layers.concatenate([t, x])), # Response model
             n_samples=1 # Number of samples used to estimate the response
             )
est.fit(Y, T, X=X, Z=Z) # Z -> instrumental variables
treatment_effects = est.effect(X_test)

See the References section for more details.

Interpretability

Tree Interpreter of the CATE model
from econml.cate_interpreter import SingleTreeCateInterpreter
import matplotlib.pyplot as plt
intrp = SingleTreeCateInterpreter(include_model_uncertainty=True, max_depth=2, min_samples_leaf=10)
# We interpret the CATE model's behavior based on the features used for heterogeneity
intrp.interpret(est, X)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()


Policy Interpreter of the CATE model
from econml.cate_interpreter import SingleTreePolicyInterpreter
import matplotlib.pyplot as plt
# We find a tree-based treatment policy based on the CATE model
intrp = SingleTreePolicyInterpreter(risk_level=0.05, max_depth=2, min_samples_leaf=1, min_impurity_decrease=0.001)
intrp.interpret(est, X, sample_treatment_costs=0.2)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()


SHAP values for the CATE model
import shap
from econml.dml import CausalForestDML
est = CausalForestDML()
est.fit(Y, T, X=X, W=W)
shap_values = est.shap_values(X)
shap.summary_plot(shap_values['Y0']['T0'])

Causal Model Selection and Cross-Validation

Causal model selection with the `RScorer`
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import PolynomialFeatures
from econml.dml import LinearDML, NonParamDML, DML
from econml.dr import DRLearner
from econml.metalearners import XLearner, DomainAdaptationLearner, SLearner
from econml.score import RScorer

# split data in train-validation
X_train, X_val, T_train, T_val, Y_train, Y_val = train_test_split(X, T, Y, test_size=.4)

# define list of CATE estimators to select among
reg = lambda: RandomForestRegressor(min_samples_leaf=20)
clf = lambda: RandomForestClassifier(min_samples_leaf=20)
models = [('ldml', LinearDML(model_y=reg(), model_t=clf(), discrete_treatment=True,
                             linear_first_stages=False, cv=3)),
          ('xlearner', XLearner(models=reg(), cate_models=reg(), propensity_model=clf())),
          ('dalearner', DomainAdaptationLearner(models=reg(), final_models=reg(), propensity_model=clf())),
          ('slearner', SLearner(overall_model=reg())),
          ('drlearner', DRLearner(model_propensity=clf(), model_regression=reg(),
                                  model_final=reg(), cv=3)),
          ('rlearner', NonParamDML(model_y=reg(), model_t=clf(), model_final=reg(),
                                   discrete_treatment=True, cv=3)),
          ('dml3dlasso', DML(model_y=reg(), model_t=clf(),
                             model_final=LassoCV(cv=3, fit_intercept=False),
                             discrete_treatment=True,
                             featurizer=PolynomialFeatures(degree=3),
                             linear_first_stages=False, cv=3))
]

# fit cate models on train data
models = [(name, mdl.fit(Y_train, T_train, X=X_train)) for name, mdl in models]

# score cate models on validation data
scorer = RScorer(model_y=reg(), model_t=clf(),
                 discrete_treatment=True, cv=3, mc_iters=2, mc_agg='median')
scorer.fit(Y_val, T_val, X=X_val)
rscore = [scorer.score(mdl) for _, mdl in models]
# select the best model
mdl, _ = scorer.best_model([mdl for _, mdl in models])
# create weighted ensemble model based on score performance
mdl, _ = scorer.ensemble([mdl for _, mdl in models])
First Stage Model Selection

First stage models can be selected either by passing in cross-validated models (e.g. sklearn.linear_model.LassoCV) to EconML's estimators, or by performing the first stage model selection outside of EconML and passing in the selected model. Unless selecting among a large set of hyperparameters, choosing first stage models externally is the preferred method due to statistical and computational advantages.

from econml.dml import LinearDML
from sklearn import clone
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

cv_model = GridSearchCV(
              estimator=RandomForestRegressor(),
              param_grid={
                  "max_depth": [3, None],
                  "n_estimators": (10, 30, 50, 100, 200),
                  "max_features": (2, 4, 6),
              },
              cv=5,
           )
# First stage model selection within EconML
# This is more direct, but computationally and statistically less efficient
est = LinearDML(model_y=cv_model, model_t=cv_model)
# First stage model selection outside of EconML
# This is the most efficient, but requires boilerplate code
model_t = clone(cv_model).fit(W, T).best_estimator_
model_y = clone(cv_model).fit(W, Y).best_estimator_
est = LinearDML(model_y=model_y, model_t=model_t)

Inference

Whenever inference is enabled, one can get a more structured InferenceResults object with richer inference information, such as p-values and z-statistics. When the CATE model is linear and parametric, a summary() method is also available. For instance:

from econml.dml import LinearDML
# Use defaults
est = LinearDML()
est.fit(Y, T, X=X, W=W)
# Get the effect inference summary, which includes the standard error, z test score, p value, and confidence interval given each sample X[i]
est.effect_inference(X_test).summary_frame(alpha=0.05, value=0, decimals=3)
# Get the population summary for the entire sample X
est.effect_inference(X_test).population_summary(alpha=0.1, value=0, decimals=3, tol=0.001)
#  Get the parameter inference summary for the final model
est.summary()
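As a small usage sketch (reusing the fitted est and the X_test array used above, together with the effect_interval API shown earlier), the interval endpoints can be used, for example, to flag samples whose confidence interval excludes a zero effect:

import numpy as np

lb, ub = est.effect_interval(X_test, alpha=0.05)
# flag samples whose 95% confidence interval excludes zero
significant = (np.asarray(lb) > 0) | (np.asarray(ub) < 0)
print(f"{significant.mean():.0%} of test points have an interval excluding zero")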

Policy Learning

You can also perform direct policy learning from observational data, using the doubly robust method for offline policy learning. These methods directly predict a recommended treatment, without internally fitting an explicit model of the conditional average treatment effect.

Doubly Robust Policy Learning
from econml.policy import DRPolicyTree, DRPolicyForest
import matplotlib.pyplot as plt

# fit a single binary decision tree policy
policy = DRPolicyTree(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the binary decision tree
plt.figure(figsize=(10,5))
policy.plot()
# get feature importances
importances = policy.feature_importances_

# fit a binary decision forest
policy = DRPolicyForest(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the first tree in the ensemble
plt.figure(figsize=(10,5))
policy.plot(0)
# get feature importances
importances = policy.feature_importances_


To see more complex examples, go to the notebooks section of the repository. For a more detailed description of the treatment effect estimation algorithms, see the EconML documentation.

For Developers

You can get started by cloning this repository. We use setuptools for building and distributing our package. We rely on some recent features of setuptools, so make sure to upgrade to a recent version with pip install setuptools --upgrade. Then from your local copy of the repository you can run pip install -e . to get started (but depending on what you're doing you might want to install with extras instead, like pip install -e .[plt] if you want to use matplotlib integration, or you can use pip install -e .[all] to include all extras).

Running the tests

This project uses pytest for testing. To run tests locally after installing the package, you can use pip install pytest-runner followed by python setup.py pytest.

We have added pytest marks to some tests to make it easier to run a subset, and you can set the PYTEST_ADDOPTS environment variable to take advantage of this. For instance, you can set it to -m "not (notebook or automl)" to skip notebook and automl tests that have some additional dependencies.

Generating the documentation

This project's documentation is generated via Sphinx. Note that we use graphviz's dot application to produce some of the images in our documentation, so you should make sure that dot is installed and in your path.

To generate a local copy of the documentation from a clone of this repository, just run python setup.py build_sphinx -W -E -a, which will build the documentation and place it under the build/sphinx/html path.

The reStructuredText files that make up the documentation are stored in the docs directory; module documentation is automatically generated by the Sphinx build process.

Blogs and Publications

Citation

If you use EconML in your research, please cite us as follows:

Microsoft Research. EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation. https://github.com/microsoft/EconML, 2019. Version 0.x.

BibTex:

@misc{econml,
  author={Microsoft Research},
  title={{EconML}: {A Python Package for ML-Based Heterogeneous Treatment Effects Estimation}},
  howpublished={https://github.com/microsoft/EconML},
  note={Version 0.x},
  year={2019}
}

Contributing and Feedback

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

References

S. Athey and S. Wager. Policy Learning with Observational Data. Econometrica, 89(1): 133-161, 2021.

X. Nie and S. Wager. Quasi-Oracle Estimation of Heterogeneous Treatment Effects. Biometrika, 2020.

V. Syrgkanis, V. Lei, M. Oprescu, M. Hei, K. Battocchi, G. Lewis. Machine Learning Estimation of Heterogeneous Treatment Effects with Instruments. Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019 (Spotlight Presentation)

D. Foster, V. Syrgkanis. Orthogonal Statistical Learning. Proceedings of the 32nd Annual Conference on Learning Theory (COLT), 2019 (Best Paper Award)

M. Oprescu, V. Syrgkanis and Z. S. Wu. Orthogonal Random Forest for Causal Inference. Proceedings of the 36th International Conference on Machine Learning (ICML), 2019.

S. Künzel, J. Sekhon, P. Bickel and B. Yu. Metalearners for estimating heterogeneous treatment effects using machine learning. Proceedings of the National Academy of Sciences, 116(10): 4156-4165, 2019.

S. Athey, J. Tibshirani, S. Wager. Generalized random forests. Annals of Statistics, 47, no. 2, 1148--1178, 2019.

V. Chernozhukov, D. Nekipelov, V. Semenova, V. Syrgkanis. Plug-in Regularized Estimation of High-Dimensional Parameters in Nonlinear Semiparametric Models. ArXiv preprint arXiv:1806.04823, 2018.

S. Wager, S. Athey. Estimation and Inference of Heterogeneous Treatment Effects using Random Forests. Journal of the American Statistical Association, 113:523, 1228-1242, 2018.

Jason Hartford, Greg Lewis, Kevin Leyton-Brown, and Matt Taddy. Deep IV: A flexible approach for counterfactual prediction. Proceedings of the 34th International Conference on Machine Learning, ICML'17, 2017.

V. Chernozhukov, D. Chetverikov, M. Demirer, E. Duflo, C. Hansen, and W. Newey. Double Machine Learning for Treatment and Causal Parameters. ArXiv preprint arXiv:1608.00060, 2016.

M. Dudik, D. Erhan, J. Langford, and L. Li. Doubly Robust Policy Evaluation and Optimization. Statistical Science, 29(4): 485-511, 2014.
