This package contains several methods for calculating Conditional Average Treatment Effects

EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation

EconML is a Python package for estimating heterogeneous treatment effects from observational data via machine learning. This package was designed and built as part of the ALICE project at Microsoft Research, with the goal of combining state-of-the-art machine learning techniques with econometrics to bring automation to complex causal inference problems. The promise of EconML:

  • Implement recent techniques in the literature at the intersection of econometrics and machine learning
  • Maintain flexibility in modeling the effect heterogeneity (via techniques such as random forests, boosting, lasso and neural nets), while preserving the causal interpretation of the learned model and often offering valid confidence intervals
  • Use a unified API
  • Build on standard Python packages for Machine Learning and Data Analysis

One of the biggest promises of machine learning is to automate decision making in a multitude of domains. At the core of many data-driven personalized decision scenarios is the estimation of heterogeneous treatment effects: what is the causal effect of an intervention on an outcome of interest for a sample with a particular set of features? In a nutshell, this toolkit is designed to measure the causal effect of some treatment variable(s) T on an outcome variable Y, controlling for a set of features X, W, and to estimate how that effect varies as a function of X. The methods implemented are applicable even with observational (non-experimental or historical) datasets. For the estimation results to have a causal interpretation, some methods assume no unobserved confounders (i.e. there is no unobserved variable not included in X, W that simultaneously has an effect on both T and Y), while others assume access to an instrument Z (i.e. an observed variable Z that has an effect on the treatment T but no direct effect on the outcome Y). Most methods provide confidence intervals and inference results.
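
For example, the basic workflow shared by most estimators in the package looks like the minimal sketch below on simulated data (the variable names Y, T, X, W mirror the notation above, LinearDML is just one of the estimators covered in the Usage Examples, and the data-generating process here is purely illustrative):

import numpy as np
from econml.dml import LinearDML

# simulate an outcome Y, a treatment T, heterogeneity features X and controls W
n = 1000
X = np.random.normal(size=(n, 1))
W = np.random.normal(size=(n, 3))
T = X[:, 0] + W[:, 0] + np.random.normal(size=n)
Y = T * (1 + 0.5 * X[:, 0]) + W[:, 0] + np.random.normal(size=n)

est = LinearDML()
est.fit(Y, T, X=X, W=W)                      # estimate the effect of T on Y as a function of X
cate = est.effect(X)                         # per-sample treatment effect estimates
lb, ub = est.effect_interval(X, alpha=0.05)  # confidence intervals from the default inference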

For detailed information about the package, consult the documentation at https://econml.azurewebsites.net/.

For information on use cases and background material on causal inference and heterogeneous treatment effects see our webpage at https://www.microsoft.com/en-us/research/project/econml/

News

June 25, 2021: Release v0.12.0b3, see release notes here

Previous releases

June 18, 2021: Release v0.12.0b2, see release notes here

June 7, 2021: Release v0.12.0b1, see release notes here

May 18, 2021: Release v0.11.1, see release notes here

May 8, 2021: Release v0.11.0, see release notes here

March 22, 2021: Release v0.10.0, see release notes here

March 11, 2021: Release v0.9.2, see release notes here

March 3, 2021: Release v0.9.1, see release notes here

February 20, 2021: Release v0.9.0, see release notes here

January 20, 2021: Release v0.9.0b1, see release notes here

November 20, 2020: Release v0.8.1, see release notes here

November 18, 2020: Release v0.8.0, see release notes here

September 4, 2020: Release v0.8.0b1, see release notes here

March 6, 2020: Release v0.7.0, see release notes here

February 18, 2020: Release v0.7.0b1, see release notes here

January 10, 2020: Release v0.6.1, see release notes here

December 6, 2019: Release v0.6, see release notes here

November 21, 2019: Release v0.5, see release notes here.

June 3, 2019: Release v0.4, see release notes here.

May 3, 2019: Release v0.3, see release notes here.

April 10, 2019: Release v0.2, see release notes here.

March 6, 2019: Release v0.1, welcome to have a try and provide feedback.

Getting Started

Installation

Install the latest release from PyPI:

pip install econml

To install from source, see the For Developers section below.
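
If you also want the optional integrations, the extras described in the For Developers section (e.g. [plt] for matplotlib support, [all] for everything) should be installable from PyPI as well, assuming they are published with the release:

pip install econml[plt]
pip install econml[all]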

Usage Examples

Estimation Methods

Double Machine Learning (aka RLearner)
  • Linear final stage
from econml.dml import LinearDML
from sklearn.linear_model import LassoCV
from econml.inference import BootstrapInference

est = LinearDML(model_y=LassoCV(), model_t=LassoCV())
### Estimate with OLS confidence intervals
est.fit(Y, T, X=X, W=W) # W -> high-dimensional confounders, X -> features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals

### Estimate with bootstrap confidence intervals
est.fit(Y, T, X=X, W=W, inference='bootstrap')  # with default bootstrap parameters
est.fit(Y, T, X=X, W=W, inference=BootstrapInference(n_bootstrap_samples=100))  # or customized
lb, ub = est.effect_interval(X_test, alpha=0.05) # Bootstrap confidence intervals
  • Sparse linear final stage
from econml.dml import SparseLinearDML
from sklearn.linear_model import LassoCV

est = SparseLinearDML(model_y=LassoCV(), model_t=LassoCV())
est.fit(Y, T, X=X, W=W) # X -> high dimensional features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # Confidence intervals via debiased lasso
  • Generic Machine Learning last stage
from econml.dml import NonParamDML
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

est = NonParamDML(model_y=RandomForestRegressor(),
                  model_t=RandomForestClassifier(),
                  model_final=RandomForestRegressor(),
                  discrete_treatment=True)
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
Causal Forests
from econml.dml import CausalForestDML
from sklearn.linear_model import LassoCV
# Use defaults
est = CausalForestDML()
# Or specify hyperparameters
est = CausalForestDML(criterion='het', n_estimators=500,       
                      min_samples_leaf=10, 
                      max_depth=10, max_samples=0.5,
                      discrete_treatment=False,
                      model_t=LassoCV(), model_y=LassoCV())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Random Forests
from econml.orf import DMLOrthoForest, DROrthoForest
from econml.sklearn_extensions.linear_model import WeightedLasso, WeightedLassoCV
# Use defaults
est = DMLOrthoForest()
est = DROrthoForest()
# Or specify hyperparameters
est = DMLOrthoForest(n_trees=500, min_leaf_size=10,
                     max_depth=10, subsample_ratio=0.7,
                     lambda_reg=0.01,
                     discrete_treatment=False,
                     model_T=WeightedLasso(alpha=0.01), model_Y=WeightedLasso(alpha=0.01),
                     model_T_final=WeightedLassoCV(cv=3), model_Y_final=WeightedLassoCV(cv=3))
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Meta-Learners
  • XLearner
from econml.metalearners import XLearner
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
import numpy as np

est = XLearner(models=GradientBoostingRegressor(),
               propensity_model=GradientBoostingClassifier(),
               cate_models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))

# Fit with bootstrap confidence interval construction enabled
est.fit(Y, T, X=np.hstack([X, W]), inference='bootstrap')
treatment_effects = est.effect(np.hstack([X_test, W_test]))
lb, ub = est.effect_interval(np.hstack([X_test, W_test]), alpha=0.05) # Bootstrap CIs
  • SLearner
from econml.metalearners import SLearner
from sklearn.ensemble import GradientBoostingRegressor
import numpy as np

est = SLearner(overall_model=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
  • TLearner
from econml.metalearners import TLearner
from sklearn.ensemble import GradientBoostingRegressor
import numpy as np

est = TLearner(models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
Doubly Robust Learners
  • Linear final stage
from econml.dr import LinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = LinearDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Sparse linear final stage
from econml.dr import SparseLinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = SparseLinearDRLearner(model_propensity=GradientBoostingClassifier(),
                            model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Nonparametric final stage
from econml.dr import ForestDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = ForestDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Instrumental Variables
  • Intent to Treat Doubly Robust Learner (discrete instrument, discrete treatment)
from econml.iv.dr import LinearIntentToTreatDRIV
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
from sklearn.linear_model import LinearRegression

est = LinearIntentToTreatDRIV(model_Y_X=GradientBoostingRegressor(),
                              model_T_XZ=GradientBoostingClassifier(),
                              flexible_model_effect=GradientBoostingRegressor())
est.fit(Y, T, Z=Z, X=X) # OLS inference by default
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals
Deep Instrumental Variables
import keras
from econml.iv.nnet import DeepIV

treatment_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(64, activation='relu'),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(32, activation='relu'),
                                    keras.layers.Dropout(0.17)])
response_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(64, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(32, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(1)])
est = DeepIV(n_components=10, # Number of gaussians in the mixture density networks
             m=lambda z, x: treatment_model(keras.layers.concatenate([z, x])), # Treatment model
             h=lambda t, x: response_model(keras.layers.concatenate([t, x])), # Response model
             n_samples=1 # Number of samples used to estimate the response
             )
est.fit(Y, T, X=X, Z=Z) # Z -> instrumental variables
treatment_effects = est.effect(X_test)

See the References section for more details.

Interpretability

Tree Interpreter of the CATE model
from econml.cate_interpreter import SingleTreeCateInterpreter
import matplotlib.pyplot as plt

intrp = SingleTreeCateInterpreter(include_model_uncertainty=True, max_depth=2, min_samples_leaf=10)
# We interpret the CATE model's behavior based on the features used for heterogeneity
intrp.interpret(est, X)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()

Policy Interpreter of the CATE model
from econml.cate_interpreter import SingleTreePolicyInterpreter
import matplotlib.pyplot as plt

# We find a tree-based treatment policy based on the CATE model
intrp = SingleTreePolicyInterpreter(risk_level=0.05, max_depth=2, min_samples_leaf=1, min_impurity_decrease=.001)
intrp.interpret(est, X, sample_treatment_costs=0.2)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()

SHAP values for the CATE model
import shap
from econml.dml import CausalForestDML
est = CausalForestDML()
est.fit(Y, T, X=X, W=W)
shap_values = est.shap_values(X)
shap.summary_plot(shap_values['Y0']['T0'])

Causal Model Selection and Cross-Validation

Causal model selection with the RScorer
from econml.score import RScorer
from econml.dml import LinearDML, NonParamDML, DML
from econml.dr import DRLearner
from econml.metalearners import XLearner, SLearner, DomainAdaptationLearner
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import PolynomialFeatures
from sklearn.model_selection import train_test_split

# split data in train-validation
X_train, X_val, T_train, T_val, Y_train, Y_val = train_test_split(X, T, Y, test_size=.4)

# define list of CATE estimators to select among
reg = lambda: RandomForestRegressor(min_samples_leaf=20)
clf = lambda: RandomForestClassifier(min_samples_leaf=20)
models = [('ldml', LinearDML(model_y=reg(), model_t=clf(), discrete_treatment=True,
                             linear_first_stages=False, cv=3)),
          ('xlearner', XLearner(models=reg(), cate_models=reg(), propensity_model=clf())),
          ('dalearner', DomainAdaptationLearner(models=reg(), final_models=reg(), propensity_model=clf())),
          ('slearner', SLearner(overall_model=reg())),
          ('drlearner', DRLearner(model_propensity=clf(), model_regression=reg(),
                                  model_final=reg(), cv=3)),
          ('rlearner', NonParamDML(model_y=reg(), model_t=clf(), model_final=reg(),
                                   discrete_treatment=True, cv=3)),
          ('dml3dlasso', DML(model_y=reg(), model_t=clf(),
                             model_final=LassoCV(cv=3, fit_intercept=False),
                             discrete_treatment=True,
                             featurizer=PolynomialFeatures(degree=3),
                             linear_first_stages=False, cv=3))
]

# fit cate models on train data
models = [(name, mdl.fit(Y_train, T_train, X=X_train)) for name, mdl in models]

# score cate models on validation data
scorer = RScorer(model_y=reg(), model_t=clf(),
                 discrete_treatment=True, cv=3, mc_iters=2, mc_agg='median')
scorer.fit(Y_val, T_val, X=X_val)
rscore = [scorer.score(mdl) for _, mdl in models]
# select the best model
mdl, _ = scorer.best_model([mdl for _, mdl in models])
# create weighted ensemble model based on score performance
mdl, _ = scorer.ensemble([mdl for _, mdl in models])
First Stage Model Selection

First stage models can be selected either by passing in cross-validated models (e.g. sklearn.linear_model.LassoCV) to EconML's estimators, or by performing the first stage model selection outside of EconML and passing in the selected model. Unless selecting among a large set of hyperparameters, the latter approach is preferred because of its statistical and computational advantages.

from econml.dml import LinearDML
from sklearn import clone
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

cv_model = GridSearchCV(
              estimator=RandomForestRegressor(),
              param_grid={
                  "max_depth": [3, None],
                  "n_estimators": (10, 30, 50, 100, 200),
                  "max_features": (2, 4, 6),
              },
              cv=5,
           )
# First stage model selection within EconML
# This is more direct, but computationally and statistically less efficient
est = LinearDML(model_y=cv_model, model_t=cv_model)
# First stage model selection outside of EconML
# This is the most efficient, but requires boilerplate code
model_t = clone(cv_model).fit(W, T).best_estimator_
model_y = clone(cv_model).fit(W, Y).best_estimator_
est = LinearDML(model_y=model_y, model_t=model_t)

Inference

Whenever inference is enabled, one can obtain a more structured InferenceResults object with more elaborate inference information, such as p-values and z-statistics. When the CATE model is linear and parametric, a summary() method is also enabled. For instance:

from econml.dml import LinearDML
# Use defaults
est = LinearDML()
est.fit(Y, T, X=X, W=W)
# Get the effect inference summary, which includes the standard error, z test score, p value, and confidence interval given each sample X[i]
est.effect_inference(X_test).summary_frame(alpha=0.05, value=0, decimals=3)
# Get the population summary for the entire sample X
est.effect_inference(X_test).population_summary(alpha=0.1, value=0, decimals=3, tol=0.001)
#  Get the parameter inference summary for the final model
est.summary()

Policy Learning

You can also perform direct policy learning from observational data, using the doubly robust method for offline policy learning. These methods directly predict a recommended treatment, without internally fitting an explicit model of the conditional average treatment effect.

Doubly Robust Policy Learning
from econml.policy import DRPolicyTree, DRPolicyForest
import matplotlib.pyplot as plt

# fit a single binary decision tree policy
policy = DRPolicyTree(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the binary decision tree
plt.figure(figsize=(10,5))
policy.plot()
# get feature importances
importances = policy.feature_importances_

# fit a binary decision forest
policy = DRPolicyForest(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the first tree in the ensemble
plt.figure(figsize=(10,5))
policy.plot(0)
# get feature importances
importances = policy.feature_importances_

To see more complex examples, go to the notebooks section of the repository. For a more detailed description of the treatment effect estimation algorithms, see the EconML documentation.

For Developers

You can get started by cloning this repository. We use setuptools for building and distributing our package, and we rely on some recent setuptools features, so make sure to upgrade to a recent version with pip install setuptools --upgrade. Then, from your local copy of the repository, run pip install -e . to get started (depending on what you're doing, you may want to install with extras instead, e.g. pip install -e .[plt] for the matplotlib integration, or pip install -e .[all] to include all extras).
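
Concretely, a typical development setup could look like the following sketch of the steps above (using the repository URL given in the Citation section):

git clone https://github.com/microsoft/EconML.git
cd EconML
pip install --upgrade setuptools
pip install -e .[all]   # or `pip install -e .` / `pip install -e .[plt]`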

Running the tests

This project uses pytest for testing. To run tests locally after installing the package, you can use pip install pytest-runner followed by python setup.py pytest.

We have added pytest marks to some tests to make it easier to run a subset, and you can set the PYTEST_ADDOPTS environment variable to take advantage of this. For instance, you can set it to -m "not (notebook or automl)" to skip notebook and automl tests that have some additional dependencies.
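
For example, on a Unix-like shell this could look like the following sketch, combining the mark expression above with the test command from the previous paragraph:

export PYTEST_ADDOPTS='-m "not (notebook or automl)"'
python setup.py pytest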

Generating the documentation

This project's documentation is generated via Sphinx. Note that we use graphviz's dot application to produce some of the images in our documentation, so you should make sure that dot is installed and in your path.

To generate a local copy of the documentation from a clone of this repository, just run python setup.py build_sphinx -W -E -a, which will build the documentation and place it under the build/sphinx/html path.

The reStructuredText files that make up the documentation are stored in the docs directory; module documentation is automatically generated by the Sphinx build process.

Citation

If you use EconML in your research, please cite us as follows:

Microsoft Research. EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation. https://github.com/microsoft/EconML, 2019. Version 0.x.

BibTex:

@misc{econml,
  author={Microsoft Research},
  title={{EconML}: {A Python Package for ML-Based Heterogeneous Treatment Effects Estimation}},
  howpublished={https://github.com/microsoft/EconML},
  note={Version 0.x},
  year={2019}
}

Contributing and Feedback

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

References

S. Athey, S. Wager. Policy Learning with Observational Data. Econometrica, 89(1), 133-161, 2021.

X. Nie, S. Wager. Quasi-Oracle Estimation of Heterogeneous Treatment Effects. Biometrika, 2020.

V. Syrgkanis, V. Lei, M. Oprescu, M. Hei, K. Battocchi, G. Lewis. Machine Learning Estimation of Heterogeneous Treatment Effects with Instruments. Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019 (Spotlight Presentation)

D. Foster, V. Syrgkanis. Orthogonal Statistical Learning. Proceedings of the 32nd Annual Conference on Learning Theory (COLT), 2019 (Best Paper Award)

M. Oprescu, V. Syrgkanis and Z. S. Wu. Orthogonal Random Forest for Causal Inference. Proceedings of the 36th International Conference on Machine Learning (ICML), 2019.

S. Künzel, J. Sekhon, P. Bickel and B. Yu. Metalearners for estimating heterogeneous treatment effects using machine learning. Proceedings of the National Academy of Sciences, 116(10), 4156-4165, 2019.

S. Athey, J. Tibshirani, S. Wager. Generalized random forests. Annals of Statistics, 47, no. 2, 1148--1178, 2019.

V. Chernozhukov, D. Nekipelov, V. Semenova, V. Syrgkanis. Plug-in Regularized Estimation of High-Dimensional Parameters in Nonlinear Semiparametric Models. ArXiv preprint arXiv:1806.04823, 2018.

S. Wager, S. Athey. Estimation and Inference of Heterogeneous Treatment Effects using Random Forests. Journal of the American Statistical Association, 113:523, 1228-1242, 2018.

Jason Hartford, Greg Lewis, Kevin Leyton-Brown, and Matt Taddy. Deep IV: A flexible approach for counterfactual prediction. Proceedings of the 34th International Conference on Machine Learning, ICML'17, 2017.

V. Chernozhukov, D. Chetverikov, M. Demirer, E. Duflo, C. Hansen, and W. Newey. Double Machine Learning for Treatment and Causal Parameters. ArXiv preprint arXiv:1608.00060, 2016.

M. Dudik, D. Erhan, J. Langford, L. Li. Doubly robust policy evaluation and optimization. Statistical Science, 29(4), 485-511, 2014.
