
Monte Carlo Post-analysis package for global sensitivity analysis and integration

Project description

MCPost: Monte Carlo Post-analysis Package


MCPost is a comprehensive Python package for post-analysis of Monte Carlo samples, providing tools for global sensitivity analysis (GSA) and Monte Carlo integration with modern packaging standards and extensive documentation.

Features

Global Sensitivity Analysis

  • Multiple sensitivity metrics: Mutual Information, Distance Correlation, Permutation Importance
  • Gaussian Process surrogates with Automatic Relevance Determination (ARD)
  • Sobol' indices for variance-based sensitivity analysis
  • Partial Dependence Plots for interpretable results
  • Robust preprocessing with automatic constant column detection
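Several of these metrics are easy to sanity-check by hand. Below is a minimal pure-NumPy sketch of the (biased) sample distance-correlation estimator, one of the metrics listed above, applied to a toy problem with one informative and one inert input. This is an illustrative re-implementation for intuition, not MCPost's API:

```python
import numpy as np

def distance_correlation(x, y):
    """Biased sample distance correlation between two 1-D arrays."""
    def centered_dist(a):
        d = np.abs(a[:, None] - a[None, :])  # pairwise distance matrix
        # double-center: subtract row means, column means, add grand mean
        return d - d.mean(axis=0) - d.mean(axis=1)[:, None] + d.mean()

    A = centered_dist(np.asarray(x, dtype=float))
    B = centered_dist(np.asarray(y, dtype=float))
    dcov2 = (A * B).mean()  # squared distance covariance (V-statistic, >= 0)
    return float(np.sqrt(dcov2 / np.sqrt((A * A).mean() * (B * B).mean())))

rng = np.random.default_rng(0)
x = rng.uniform(-np.pi, np.pi, 400)
z = rng.uniform(-np.pi, np.pi, 400)  # independent of the output
y = np.sin(x)                        # nonlinear dependence on x only

print(distance_correlation(x, y))  # high: x drives y
print(distance_correlation(z, y))  # near zero: z is inert
```

Unlike Pearson correlation, distance correlation is zero only under independence, which is why it catches the nonlinear `sin` relationship here. MCPost's pipeline uses the `dcor` library for this computation.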

Monte Carlo Integration

  • Standard Monte Carlo integration with importance sampling
  • Quasi-Monte Carlo methods (Sobol, Halton sequences)
  • Automatic integration with adaptive sampling strategies
  • Flexible PDF specification for target and sampling distributions
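To illustrate what the quasi-Monte Carlo sequences buy, here is a small comparison of a scrambled Sobol sequence against plain pseudo-random sampling, using `scipy.stats.qmc` (SciPy is a core dependency; `qmc.Halton` works analogously). The integrand and sample sizes are arbitrary choices for this sketch:

```python
import numpy as np
from scipy.stats import qmc

def f(u):
    # integrand on the unit square; exact integral is 1/4
    return u[:, 0] * u[:, 1]

sobol = qmc.Sobol(d=2, scramble=True, seed=0)
u_qmc = sobol.random(1024)          # 2^10 points (powers of 2 preferred)

rng = np.random.default_rng(0)
u_mc = rng.random((1024, 2))        # same budget of pseudo-random points

qmc_est = f(u_qmc).mean()
mc_est = f(u_mc).mean()
print(f"Sobol QMC: {qmc_est:.5f}   plain MC: {mc_est:.5f}   exact: 0.25")
```

For smooth integrands the low-discrepancy points fill the domain more evenly, so the QMC estimate typically sits much closer to the exact value at the same sample budget.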

Modern Package Features

  • Type hints and comprehensive documentation
  • Modular design with clean public APIs
  • Optional dependencies for visualization and development
  • Extensive testing with property-based tests
  • Performance optimizations for large datasets

Installation

MCPost supports multiple installation methods to suit different use cases:

Basic Installation

For core functionality (GSA and integration without plotting):

pip install mcpost

Installation with Visualization Support

For full functionality including plotting and visualization:

pip install mcpost[viz]

Development Installation

For contributors and developers:

# Clone the repository
git clone https://github.com/zzhang0123/mcpost.git
cd mcpost

# Install in development mode with all dependencies
pip install -e .[dev]

# Run tests to verify installation
pytest

Installation from Source

For the latest development version:

pip install git+https://github.com/zzhang0123/mcpost.git

Conda Installation

MCPost will be available on conda-forge (coming soon):

conda install -c conda-forge mcpost

Quick Start

Global Sensitivity Analysis

MCPost provides comprehensive GSA capabilities with multiple sensitivity metrics:

import numpy as np
from mcpost import gsa_pipeline

# Generate sample data (Ishigami function example)
np.random.seed(42)
n_samples = 1000
X = np.random.uniform(-np.pi, np.pi, (n_samples, 3))

# Ishigami function: f(x1,x2,x3) = sin(x1) + 7*sin(x2)^2 + 0.1*x3^4*sin(x1)
y = (np.sin(X[:, 0]) + 
     7 * np.sin(X[:, 1])**2 + 
     0.1 * X[:, 2]**4 * np.sin(X[:, 0]))
Y = y.reshape(-1, 1)

# Run comprehensive GSA analysis
results = gsa_pipeline(
    X, Y,
    param_names=["x1", "x2", "x3"],
    feature_names=["ishigami"],
    scaler="minmax",
    enable_sobol=True,
    enable_gp=True,
    enable_perm=True,
    make_pdp=True
)

# View sensitivity results
print("Sensitivity Analysis Results:")
print(results["results"]["ishigami"]["table"])

# Plot sensitivity metrics (requires matplotlib)
from mcpost import plot_sensitivity_metrics
plot_sensitivity_metrics(results, save_path="sensitivity_plot.png")
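For this example the first-order Sobol' indices are known in closed form, which makes a handy cross-check for the Sobol' output. The values below are standard textbook results for the Ishigami function (a = 7, b = 0.1) and are independent of MCPost:

```python
import numpy as np

a, b = 7.0, 0.1
V1 = 0.5 * (1 + b * np.pi**4 / 5) ** 2   # variance contribution of x1
V2 = a**2 / 8                            # variance contribution of x2
V13 = 8 * b**2 * np.pi**8 / 225          # x1-x3 interaction contribution
V = V1 + V2 + V13                        # total variance (x3 alone: 0)

S1, S2, S3 = V1 / V, V2 / V, 0.0
print(f"S1 ≈ {S1:.3f}, S2 ≈ {S2:.3f}, S3 = {S3:.1f}")  # ≈ 0.314, 0.442, 0.0
```

Note that x3 has a first-order index of zero yet matters through its interaction with x1, which is exactly the kind of effect that comparing first-order and total-order indices reveals.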

Advanced GSA Usage

# Custom GSA configuration
from mcpost import GSAConfig, gsa_for_target

# Configure GSA parameters
config = GSAConfig()
config.DEFAULT_SCALER = "standard"
config.DEFAULT_N_SOBOL = 8192

# Run GSA for specific target
target_results = gsa_for_target(
    X, Y[:, 0],  # Single target
    param_names=["x1", "x2", "x3"],
    target_name="ishigami",
    scaler=config.DEFAULT_SCALER,
    n_sobol=config.DEFAULT_N_SOBOL
)

Monte Carlo Integration

MCPost supports various integration methods for different use cases:

import numpy as np
from mcpost import monte_carlo_integral, qmc_integral_auto

# Define integration problem: E[x*sin(y)] where (x,y) ~ N(0,I)
def target_pdf(theta):
    """Target probability density function (standard normal)"""
    return np.exp(-0.5 * np.sum(theta**2, axis=1)) / (2 * np.pi)

def integrand(theta):
    """Function to integrate: f(x,y) = x * sin(y)"""
    return theta[:, 0] * np.sin(theta[:, 1])

# Method 1: Standard Monte Carlo
np.random.seed(42)
theta_samples = np.random.normal(0, 1, (5000, 2))
f_values = integrand(theta_samples)

mc_result = monte_carlo_integral(theta_samples, f_values, target_pdf)
print(f"Monte Carlo result: {mc_result['integral']:.6f} ± {mc_result['uncertainty']:.6f}")

# Method 2: Quasi-Monte Carlo (automatic)
qmc_result = qmc_integral_auto(
    N_samples=4096,
    N_params=2, 
    data_func=integrand,
    p_target=target_pdf,
    bounds=[(-4, 4), (-4, 4)]  # Integration bounds
)
print(f"QMC result: {qmc_result['integral']:.6f}")

# Method 3: QMC with importance sampling
from mcpost import qmc_integral_importance

def importance_pdf(theta):
    """Importance sampling distribution"""
    return np.exp(-0.25 * np.sum(theta**2, axis=1)) / (4 * np.pi)

qmc_is_result = qmc_integral_importance(
    N_samples=2048,
    N_params=2,
    data_func=integrand,
    p_target=target_pdf,
    q_sample=importance_pdf,
    bounds=[(-3, 3), (-3, 3)]
)
print(f"QMC + Importance Sampling: {qmc_is_result['integral']:.6f}")
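Under the hood, importance sampling rests on the identity E_p[f] = E_q[f · p/q] with samples drawn from q. A plain-NumPy cross-check of that estimator for the integrand above (whose exact value is 0, since x and sin(y) are independent with zero mean under p) looks like this; it mirrors the idea behind the calls above but does not use MCPost:

```python
import numpy as np

rng = np.random.default_rng(42)
# importance_pdf above corresponds to N(0, 2I), so sample with std sqrt(2)
theta = rng.normal(0.0, np.sqrt(2.0), (20000, 2))

p = np.exp(-0.5 * np.sum(theta**2, axis=1)) / (2 * np.pi)    # target pdf
q = np.exp(-0.25 * np.sum(theta**2, axis=1)) / (4 * np.pi)   # sampling pdf
f = theta[:, 0] * np.sin(theta[:, 1])                        # integrand

estimate = np.mean(f * p / q)  # E_p[f] via importance weights
print(f"importance-sampling estimate: {estimate:.4f}")  # should be near 0
```

Choosing a q with heavier tails than p, as here, keeps the weights p/q bounded and the estimator's variance under control.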

Integration with Custom Distributions

# Example: Integration over custom parameter space
def custom_target(theta):
    """Custom target distribution (mixture of Gaussians)"""
    comp1 = 0.6 * np.exp(-0.5 * np.sum((theta - 1)**2, axis=1))
    comp2 = 0.4 * np.exp(-0.5 * np.sum((theta + 1)**2, axis=1))
    return (comp1 + comp2) / (2 * np.pi)

def complex_integrand(theta):
    """More complex integrand"""
    return np.exp(theta[:, 0]) * np.cos(theta[:, 1]) * theta[:, 0]**2

# Use adaptive QMC integration
result = qmc_integral_auto(
    N_samples=8192,
    N_params=2,
    data_func=complex_integrand,
    p_target=custom_target,
    bounds=[(-3, 3), (-3, 3)],
    qmc_method="sobol"  # or "halton"
)
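When supplying a custom `p_target`, it is worth verifying that it actually integrates to one, since an unnormalized density can silently bias the result. A quick grid-based check for the mixture above (plain NumPy, illustrative only):

```python
import numpy as np

def custom_target(theta):
    """Custom target distribution (mixture of Gaussians), as defined above."""
    comp1 = 0.6 * np.exp(-0.5 * np.sum((theta - 1)**2, axis=1))
    comp2 = 0.4 * np.exp(-0.5 * np.sum((theta + 1)**2, axis=1))
    return (comp1 + comp2) / (2 * np.pi)

h = 0.05                                   # grid spacing
g = np.arange(-6, 6, h) + h / 2            # midpoints covering [-6, 6]
gx, gy = np.meshgrid(g, g)
pts = np.column_stack([gx.ravel(), gy.ravel()])

mass = custom_target(pts).sum() * h**2     # midpoint-rule integral
print(f"total probability mass: {mass:.4f}")  # ≈ 1.0
```

Here the 0.6/0.4 weights sum to one and each component carries the 2D Gaussian normalization 1/(2π), so the check passes; the tiny mass outside [-6, 6]² is negligible.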


Requirements

Core Dependencies

  • Python 3.8+
  • NumPy >= 1.20.0
  • Pandas >= 1.3.0
  • Scikit-learn >= 1.0.0
  • SciPy >= 1.7.0
  • dcor >= 0.5.0
  • SALib >= 1.4.0

Optional Dependencies

  • Visualization: matplotlib >= 3.5.0
  • Development: pytest, hypothesis, black, mypy
  • Documentation: sphinx, jupyter, nbsphinx

Contributing

We welcome contributions! Please see our Contributing Guide for details.

Development Setup

git clone https://github.com/zzhang0123/mcpost.git
cd mcpost
pip install -e .[dev]
pytest

Testing

MCPost includes a comprehensive test suite:

# Run all tests (excludes backward compatibility tests)
pytest tests/

# Run specific test categories
pytest tests/test_gsa/          # GSA functionality tests
pytest tests/test_integration/  # Integration tests
pytest tests/test_utils/        # Utility tests

# Run property-based tests
pytest tests/ -k "property"

# Run with coverage
pytest tests/ --cov=mcpost --cov-report=html

Backward Compatibility Tests: These require the legacy mock files gsa_pipeline.py and mc_int.py located in tests/legacy_mocks/ and are skipped in CI. For local development:

# Place original files in repository root, then:
pytest tests/test_gsa_backward_compatibility.py
pytest tests/test_integration_backward_compatibility.py

License

This project is licensed under the MIT License - see the LICENSE file for details.

Citation

If you use MCPost in your research, please cite:

@software{mcpost,
  title={MCPost: Monte Carlo Post-analysis Package},
  author={MCPost Contributors},
  url={https://github.com/zzhang0123/mcpost},
  version={0.1.0},
  year={2024}
}

Acknowledgments

MCPost builds upon several excellent open-source libraries:

  • Scikit-learn for machine learning algorithms
  • SALib for Sobol' sensitivity analysis
  • dcor for distance correlation
  • SciPy for scientific computing
  • NumPy for numerical computing


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

mc_post-0.1.1.tar.gz (114.6 kB, source)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

mc_post-0.1.1-py3-none-any.whl (55.5 kB, Python 3)

File details

Details for the file mc_post-0.1.1.tar.gz.

File metadata

  • Download URL: mc_post-0.1.1.tar.gz
  • Size: 114.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mc_post-0.1.1.tar.gz

  • SHA256: 6f16bb0f6cfbc846a3c82bf817fdb76e45487854a7870a4bd8c52d32f1b4c7c3
  • MD5: 147f78b456b47a01e0fe57079eef9874
  • BLAKE2b-256: 0d789a3864873d4084d30c16fe7b56701f536ae2eef79db3ae45884309ee4554


Provenance

The following attestation bundles were made for mc_post-0.1.1.tar.gz:

Publisher: release.yml on zzhang0123/mcpost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file mc_post-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: mc_post-0.1.1-py3-none-any.whl
  • Size: 55.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for mc_post-0.1.1-py3-none-any.whl

  • SHA256: ae80c2527fac6b651f86b688c8862bd5b5b9daf319ee0e6a42b20ee863c9669f
  • MD5: aede713e28ca412cf6b6fcaa2a04afa6
  • BLAKE2b-256: a2e41a8b05eb8f187fedd4a3965150bc7bb69e5b0ea1323b88a6ec7d7f91abc9


Provenance

The following attestation bundles were made for mc_post-0.1.1-py3-none-any.whl:

Publisher: release.yml on zzhang0123/mcpost

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
