This package contains several methods for calculating Conditional Average Treatment Effects

Project description

EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation

EconML is a Python package for estimating heterogeneous treatment effects from observational data via machine learning. This package was designed and built as part of the ALICE project at Microsoft Research, with the goal of combining state-of-the-art machine learning techniques with econometrics to bring automation to complex causal inference problems. The promise of EconML:

  • Implement recent techniques in the literature at the intersection of econometrics and machine learning
  • Maintain flexibility in modeling the effect heterogeneity (via techniques such as random forests, boosting, lasso and neural nets), while preserving the causal interpretation of the learned model and often offering valid confidence intervals
  • Use a unified API
  • Build on standard Python packages for Machine Learning and Data Analysis

One of the biggest promises of machine learning is to automate decision making in a multitude of domains. At the core of many data-driven personalized decision scenarios is the estimation of heterogeneous treatment effects: what is the causal effect of an intervention on an outcome of interest for a sample with a particular set of features? In a nutshell, this toolkit is designed to measure the causal effect of some treatment variable(s) T on an outcome variable Y, controlling for a set of features X, W, and to estimate how that effect varies as a function of X. The methods implemented are applicable even with observational (non-experimental or historical) datasets. For the estimation results to have a causal interpretation, some methods assume no unobserved confounders (i.e. there is no unobserved variable not included in X, W that simultaneously affects both T and Y), while others assume access to an instrument Z (i.e. an observed variable Z that has an effect on the treatment T but no direct effect on the outcome Y). Most methods provide confidence intervals and inference results.
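
In the usage examples below, Y, T, X, W (and Z for the instrumental variable estimators) are array-like data with one row per sample; the snippets assume they are already defined. A minimal, purely illustrative synthetic setup (the names and data-generating process here are assumptions for demonstration, not part of the package) might look like:

import numpy as np

# Illustrative synthetic data; shapes mirror the (Y, T, X, W, Z) roles described above.
rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 4))                  # features driving effect heterogeneity
W = rng.normal(size=(n, 6))                  # additional controls / potential confounders
Z = rng.binomial(1, 0.5, size=n)             # instrument (only needed by IV estimators)
T = rng.binomial(1, 1 / (1 + np.exp(-(W[:, 0] + 0.5 * Z))))          # treatment depends on W and Z
Y = (1 + 2 * X[:, 0]) * T + W[:, 1] + rng.normal(scale=0.1, size=n)  # heterogeneous effect of T on Y
X_test, W_test = rng.normal(size=(10, 4)), rng.normal(size=(10, 6))  # evaluation points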

For detailed information about the package, consult the documentation at https://econml.azurewebsites.net/.

For information on use cases and background material on causal inference and heterogeneous treatment effects see our webpage at https://www.microsoft.com/en-us/research/project/econml/

News

June 7, 2021: Release v0.12.0b1, see release notes here

Previous releases

May 18, 2021: Release v0.11.1, see release notes here

May 8, 2021: Release v0.11.0, see release notes here

March 22, 2021: Release v0.10.0, see release notes here

March 11, 2021: Release v0.9.2, see release notes here

March 3, 2021: Release v0.9.1, see release notes here

February 20, 2021: Release v0.9.0, see release notes here

January 20, 2021: Release v0.9.0b1, see release notes here

November 20, 2020: Release v0.8.1, see release notes here

November 18, 2020: Release v0.8.0, see release notes here

September 4, 2020: Release v0.8.0b1, see release notes here

March 6, 2020: Release v0.7.0, see release notes here

February 18, 2020: Release v0.7.0b1, see release notes here

January 10, 2020: Release v0.6.1, see release notes here

December 6, 2019: Release v0.6, see release notes here

November 21, 2019: Release v0.5, see release notes here.

June 3, 2019: Release v0.4, see release notes here.

May 3, 2019: Release v0.3, see release notes here.

April 10, 2019: Release v0.2, see release notes here.

March 6, 2019: Release v0.1; we welcome you to try it out and provide feedback.

Getting Started

Installation

Install the latest release from PyPI:

pip install econml

To install from source, see For Developers section below.

Usage Examples

Estimation Methods

Double Machine Learning (aka RLearner)
  • Linear final stage
from econml.dml import LinearDML
from sklearn.linear_model import LassoCV
from econml.inference import BootstrapInference

est = LinearDML(model_y=LassoCV(), model_t=LassoCV())
### Estimate with OLS confidence intervals
est.fit(Y, T, X=X, W=W) # W -> high-dimensional confounders, X -> features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals

### Estimate with bootstrap confidence intervals
est.fit(Y, T, X=X, W=W, inference='bootstrap')  # with default bootstrap parameters
est.fit(Y, T, X=X, W=W, inference=BootstrapInference(n_bootstrap_samples=100))  # or customized
lb, ub = est.effect_interval(X_test, alpha=0.05) # Bootstrap confidence intervals
  • Sparse linear final stage
from econml.dml import SparseLinearDML
from sklearn.linear_model import LassoCV

est = SparseLinearDML(model_y=LassoCV(), model_t=LassoCV())
est.fit(Y, T, X=X, W=W) # X -> high dimensional features
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # Confidence intervals via debiased lasso
  • Generic Machine Learning last stage
from econml.dml import NonParamDML
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier

est = NonParamDML(model_y=RandomForestRegressor(),
                  model_t=RandomForestClassifier(),
                  model_final=RandomForestRegressor(),
                  discrete_treatment=True)
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
Causal Forests
from econml.dml import CausalForestDML
from sklearn.linear_model import LassoCV
# Use defaults
est = CausalForestDML()
# Or specify hyperparameters
est = CausalForestDML(criterion='het', n_estimators=500,       
                      min_samples_leaf=10, 
                      max_depth=10, max_samples=0.5,
                      discrete_treatment=False,
                      model_t=LassoCV(), model_y=LassoCV())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Random Forests
from econml.orf import DMLOrthoForest, DROrthoForest
from econml.sklearn_extensions.linear_model import WeightedLasso, WeightedLassoCV
# Use defaults
est = DMLOrthoForest()
est = DROrthoForest()
# Or specify hyperparameters
est = DMLOrthoForest(n_trees=500, min_leaf_size=10,
                     max_depth=10, subsample_ratio=0.7,
                     lambda_reg=0.01,
                     discrete_treatment=False,
                     model_T=WeightedLasso(alpha=0.01), model_Y=WeightedLasso(alpha=0.01),
                     model_T_final=WeightedLassoCV(cv=3), model_Y_final=WeightedLassoCV(cv=3))
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
# Confidence intervals via Bootstrap-of-Little-Bags for forests
lb, ub = est.effect_interval(X_test, alpha=0.05)
Meta-Learners
  • XLearner
from econml.metalearners import XLearner
from sklearn.ensemble import GradientBoostingClassifier, GradientBoostingRegressor
import numpy as np

est = XLearner(models=GradientBoostingRegressor(),
               propensity_model=GradientBoostingClassifier(),
               cate_models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))

# Fit with bootstrap confidence interval construction enabled
est.fit(Y, T, X=np.hstack([X, W]), inference='bootstrap')
treatment_effects = est.effect(np.hstack([X_test, W_test]))
lb, ub = est.effect_interval(np.hstack([X_test, W_test]), alpha=0.05) # Bootstrap CIs
  • SLearner
from econml.metalearners import SLearner
from sklearn.ensemble import GradientBoostingRegressor

est = SLearner(overall_model=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
  • TLearner
from econml.metalearners import TLearner
from sklearn.ensemble import GradientBoostingRegressor

est = TLearner(models=GradientBoostingRegressor())
est.fit(Y, T, X=np.hstack([X, W]))
treatment_effects = est.effect(np.hstack([X_test, W_test]))
Doubly Robust Learners
  • Linear final stage
from econml.dr import LinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = LinearDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Sparse linear final stage
from econml.dr import SparseLinearDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = SparseLinearDRLearner(model_propensity=GradientBoostingClassifier(),
                            model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W)
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
  • Nonparametric final stage
from econml.dr import ForestDRLearner
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier

est = ForestDRLearner(model_propensity=GradientBoostingClassifier(),
                      model_regression=GradientBoostingRegressor())
est.fit(Y, T, X=X, W=W) 
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05)
Orthogonal Instrumental Variables
  • Intent to Treat Doubly Robust Learner (discrete instrument, discrete treatment)
from econml.iv.dr import LinearIntentToTreatDRIV
from sklearn.ensemble import GradientBoostingRegressor, GradientBoostingClassifier
from sklearn.linear_model import LinearRegression

est = LinearIntentToTreatDRIV(model_Y_X=GradientBoostingRegressor(),
                              model_T_XZ=GradientBoostingClassifier(),
                              flexible_model_effect=GradientBoostingRegressor())
est.fit(Y, T, Z=Z, X=X) # OLS inference by default
treatment_effects = est.effect(X_test)
lb, ub = est.effect_interval(X_test, alpha=0.05) # OLS confidence intervals
Deep Instrumental Variables
import keras
from econml.iv.nnet import DeepIV

treatment_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(64, activation='relu'),
                                    keras.layers.Dropout(0.17),
                                    keras.layers.Dense(32, activation='relu'),
                                    keras.layers.Dropout(0.17)])
response_model = keras.Sequential([keras.layers.Dense(128, activation='relu', input_shape=(2,)),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(64, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(32, activation='relu'),
                                  keras.layers.Dropout(0.17),
                                  keras.layers.Dense(1)])
est = DeepIV(n_components=10, # Number of gaussians in the mixture density network
             m=lambda z, x: treatment_model(keras.layers.concatenate([z, x])), # Treatment model
             h=lambda t, x: response_model(keras.layers.concatenate([t, x])), # Response model
             n_samples=1 # Number of samples used to estimate the response
             )
est.fit(Y, T, X=X, Z=Z) # Z -> instrumental variables
treatment_effects = est.effect(X_test)

See the References section for more details.

Interpretability

Tree Interpreter of the CATE model
from econml.cate_interpreter import SingleTreeCateInterpreter
import matplotlib.pyplot as plt

intrp = SingleTreeCateInterpreter(include_model_uncertainty=True, max_depth=2, min_samples_leaf=10)
# We interpret the CATE model's behavior based on the features used for heterogeneity
intrp.interpret(est, X)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()

Policy Interpreter of the CATE model
from econml.cate_interpreter import SingleTreePolicyInterpreter
import matplotlib.pyplot as plt

# We find a tree-based treatment policy based on the CATE model
intrp = SingleTreePolicyInterpreter(risk_level=0.05, max_depth=2, min_samples_leaf=1, min_impurity_decrease=.001)
intrp.interpret(est, X, sample_treatment_costs=0.2)
# Plot the tree
plt.figure(figsize=(25, 5))
intrp.plot(feature_names=['A', 'B', 'C', 'D'], fontsize=12)
plt.show()

SHAP values for the CATE model
import shap
from econml.dml import CausalForestDML
est = CausalForestDML()
est.fit(Y, T, X=X, W=W)
shap_values = est.shap_values(X)
shap.summary_plot(shap_values['Y0']['T0'])

Causal Model Selection and Cross-Validation

Causal model selection with the `RScorer`
from econml.score import RScorer
from econml.dml import LinearDML, NonParamDML, DML
from econml.dr import DRLearner
from econml.metalearners import XLearner, SLearner, DomainAdaptationLearner
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import PolynomialFeatures
from sklearn.model_selection import train_test_split

# split data in train-validation
X_train, X_val, T_train, T_val, Y_train, Y_val = train_test_split(X, T, Y, test_size=.4)

# define list of CATE estimators to select among
reg = lambda: RandomForestRegressor(min_samples_leaf=20)
clf = lambda: RandomForestClassifier(min_samples_leaf=20)
models = [('ldml', LinearDML(model_y=reg(), model_t=clf(), discrete_treatment=True,
                             linear_first_stages=False, cv=3)),
          ('xlearner', XLearner(models=reg(), cate_models=reg(), propensity_model=clf())),
          ('dalearner', DomainAdaptationLearner(models=reg(), final_models=reg(), propensity_model=clf())),
          ('slearner', SLearner(overall_model=reg())),
          ('drlearner', DRLearner(model_propensity=clf(), model_regression=reg(),
                                  model_final=reg(), cv=3)),
          ('rlearner', NonParamDML(model_y=reg(), model_t=clf(), model_final=reg(),
                                   discrete_treatment=True, cv=3)),
          ('dml3dlasso', DML(model_y=reg(), model_t=clf(),
                             model_final=LassoCV(cv=3, fit_intercept=False),
                             discrete_treatment=True,
                             featurizer=PolynomialFeatures(degree=3),
                             linear_first_stages=False, cv=3))
]

# fit cate models on train data
models = [(name, mdl.fit(Y_train, T_train, X=X_train)) for name, mdl in models]

# score cate models on validation data
scorer = RScorer(model_y=reg(), model_t=clf(),
                 discrete_treatment=True, cv=3, mc_iters=2, mc_agg='median')
scorer.fit(Y_val, T_val, X=X_val)
rscore = [scorer.score(mdl) for _, mdl in models]
# select the best model
mdl, _ = scorer.best_model([mdl for _, mdl in models])
# create weighted ensemble model based on score performance
mdl, _ = scorer.ensemble([mdl for _, mdl in models])
First Stage Model Selection

First stage models can be selected either by passing in cross-validated models (e.g. sklearn.linear_model.LassoCV) to EconML's estimators, or by performing the first stage model selection outside of EconML and passing in the selected model. Unless selecting among a large set of hyperparameters, choosing first stage models externally is the preferred method due to statistical and computational advantages.

from econml.dml import LinearDML
from sklearn import clone
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

cv_model = GridSearchCV(
              estimator=RandomForestRegressor(),
              param_grid={
                  "max_depth": [3, None],
                  "n_estimators": (10, 30, 50, 100, 200),
                  "max_features": (2, 4, 6),
              },
              cv=5,
           )
# First stage model selection within EconML
# This is more direct, but computationally and statistically less efficient
est = LinearDML(model_y=cv_model, model_t=cv_model)
# First stage model selection outside of EconML
# This is the most efficient, but requires boilerplate code
model_t = clone(cv_model).fit(W, T).best_estimator_
model_y = clone(cv_model).fit(W, Y).best_estimator_
est = LinearDML(model_y=model_y, model_t=model_t)

Inference

Whenever inference is enabled, one can get a more structured InferenceResults object that contains more elaborate inference information, such as p-values and z-statistics. When the CATE model is linear and parametric, a summary() method is also enabled. For instance:

from econml.dml import LinearDML
# Use defaults
est = LinearDML()
est.fit(Y, T, X=X, W=W)
# Get the effect inference summary, which includes the standard error, z test score, p value, and confidence interval given each sample X[i]
est.effect_inference(X_test).summary_frame(alpha=0.05, value=0, decimals=3)
# Get the population summary for the entire sample X
est.effect_inference(X_test).population_summary(alpha=0.1, value=0, decimals=3, tol=0.001)
#  Get the parameter inference summary for the final model
est.summary()

Policy Learning

You can also perform direct policy learning from observational data, using the doubly robust method for offline policy learning. These methods directly predict a recommended treatment, without internally fitting an explicit model of the conditional average treatment effect.

Doubly Robust Policy Learning
from econml.policy import DRPolicyTree, DRPolicyForest
import matplotlib.pyplot as plt

# fit a single binary decision tree policy
policy = DRPolicyTree(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the binary decision tree
plt.figure(figsize=(10,5))
policy.plot()
# get feature importances
importances = policy.feature_importances_

# fit a binary decision forest
policy = DRPolicyForest(max_depth=1, min_impurity_decrease=0.01, honest=True)
policy.fit(Y, T, X=X, W=W)
# predict the recommended treatment
recommended_T = policy.predict(X)
# plot the first tree in the ensemble
plt.figure(figsize=(10,5))
policy.plot(0)
# get feature importances
importances = policy.feature_importances_

To see more complex examples, go to the notebooks section of the repository. For a more detailed description of the treatment effect estimation algorithms, see the EconML documentation.

For Developers

You can get started by cloning this repository. We use setuptools for building and distributing our package. We rely on some recent features of setuptools, so make sure to upgrade to a recent version with pip install setuptools --upgrade. Then from your local copy of the repository you can run pip install -e . to get started (but depending on what you're doing you might want to install with extras instead, like pip install -e .[plt] if you want to use matplotlib integration, or you can use pip install -e .[all] to include all extras).

Running the tests

This project uses pytest for testing. To run tests locally after installing the package, you can use pip install pytest-runner followed by python setup.py pytest.

We have added pytest marks to some tests to make it easier to run a subset, and you can set the PYTEST_ADDOPTS environment variable to take advantage of this. For instance, you can set it to -m "not (notebook or automl)" to skip notebook and automl tests that have some additional dependencies.

Generating the documentation

This project's documentation is generated via Sphinx. Note that we use graphviz's dot application to produce some of the images in our documentation, so you should make sure that dot is installed and in your path.

To generate a local copy of the documentation from a clone of this repository, just run python setup.py build_sphinx -W -E -a, which will build the documentation and place it under the build/sphinx/html path.

The reStructuredText files that make up the documentation are stored in the docs directory; module documentation is automatically generated by the Sphinx build process.

Blogs and Publications

Citation

If you use EconML in your research, please cite us as follows:

Microsoft Research. EconML: A Python Package for ML-Based Heterogeneous Treatment Effects Estimation. https://github.com/microsoft/EconML, 2019. Version 0.x.

BibTeX:

@misc{econml,
  author={Microsoft Research},
  title={{EconML}: {A Python Package for ML-Based Heterogeneous Treatment Effects Estimation}},
  howpublished={https://github.com/microsoft/EconML},
  note={Version 0.x},
  year={2019}
}

Contributing and Feedback

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

References

S. Athey and S. Wager. Policy Learning with Observational Data. Econometrica, 89(1), 133-161, 2021.

X. Nie and S. Wager. Quasi-Oracle Estimation of Heterogeneous Treatment Effects. Biometrika, 2020.

V. Syrgkanis, V. Lei, M. Oprescu, M. Hei, K. Battocchi, G. Lewis. Machine Learning Estimation of Heterogeneous Treatment Effects with Instruments. Proceedings of the 33rd Conference on Neural Information Processing Systems (NeurIPS), 2019 (Spotlight Presentation)

D. Foster, V. Syrgkanis. Orthogonal Statistical Learning. Proceedings of the 32nd Annual Conference on Learning Theory (COLT), 2019 (Best Paper Award)

M. Oprescu, V. Syrgkanis and Z. S. Wu. Orthogonal Random Forest for Causal Inference. Proceedings of the 36th International Conference on Machine Learning (ICML), 2019.

S. Künzel, J. Sekhon, P. Bickel, and B. Yu. Metalearners for Estimating Heterogeneous Treatment Effects using Machine Learning. Proceedings of the National Academy of Sciences, 116(10), 4156-4165, 2019.

S. Athey, J. Tibshirani, S. Wager. Generalized random forests. Annals of Statistics, 47, no. 2, 1148--1178, 2019.

V. Chernozhukov, D. Nekipelov, V. Semenova, and V. Syrgkanis. Plug-in Regularized Estimation of High-Dimensional Parameters in Nonlinear Semiparametric Models. ArXiv preprint arXiv:1806.04823, 2018.

S. Wager, S. Athey. Estimation and Inference of Heterogeneous Treatment Effects using Random Forests. Journal of the American Statistical Association, 113:523, 1228-1242, 2018.

J. Hartford, G. Lewis, K. Leyton-Brown, and M. Taddy. Deep IV: A Flexible Approach for Counterfactual Prediction. Proceedings of the 34th International Conference on Machine Learning (ICML), 2017.

V. Chernozhukov, D. Chetverikov, M. Demirer, E. Duflo, C. Hansen, and W. Newey. Double Machine Learning for Treatment and Causal Parameters. ArXiv preprint arXiv:1608.00060, 2016.

M. Dudik, D. Erhan, J. Langford, and L. Li. Doubly Robust Policy Evaluation and Optimization. Statistical Science, 29(4), 485-511, 2014.
