
Package containing the Modular CMA-ES optimizer



The Modular CMA-ES is a Python and C++ package that provides a modular implementation of the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) algorithm. The package allows you to create various algorithmic variants of CMA-ES by enabling or disabling different modules, offering flexibility and customization in evolutionary optimization. In addition to the CMA-ES, the library includes an implementation of the Matrix Adaptation Evolution Strategy (MA-ES) algorithm, which has similar empirical performance on most problems but significantly lower runtime. All implemented modules are compatible with both the CMA-ES and the MA-ES.

This implementation is based on the algorithm introduced in the paper "Evolving the Structure of Evolution Strategies" (2016) by Sander van Rijn et al. If you would like to cite this work in your research, please cite the paper "Tuning as a Means of Assessing the Benefits of New Ideas in Interplay with Existing Algorithmic Modules" (2021) by Jacob de Nobel, Diederick Vermetten, Hao Wang, Carola Doerr and Thomas Bäck.

This README provides a high-level overview of the implemented modules and gives usage examples for both the Python-only and the C++-based versions of the framework. The complete API documentation can be found here (under construction).

Table of Contents

  1. Installation
  2. Usage
  3. Modules
  4. Citation
  5. License

Installation

You can install the Modular CMA-ES package using pip.

Python Installation

pip install modcma

Installation from source

If you want to work on a development version of the library, follow these steps. A C++ compiler is required; the instructions below have been verified with g++ (v11.1.0):

  1. Clone the repository:

    git clone git@github.com:IOHprofiler/ModularCMAES.git
    cd ModularCMAES
    
  2. Install dependencies (in a virtual environment)

    python3 -m venv env
    source ./env/bin/activate
    pip install -r requirements.txt   
    
  3. Compile the library and optionally install the package globally:

    python setup.py install
    

    or install in develop mode, which is recommended if one would like to actively develop the library:

    python setup.py develop 
    
  4. Ensure all functionality works as expected by running the unittests:

    python -m unittest discover   
    

Usage

To optimize a single function, we provide a basic fmin interface, which requires two parameters: func, the function to be minimized, and x0, the initial estimate. Optionally, any parameter that is valid for the ModularCMAES class can be passed to this function as a keyword argument. For example, to minimize the value of the sum function in 4D with a budget of 1000 function evaluations, using an active CMA-ES with an initial step size $\sigma$ of 2.5, we could use the following:

from modcma import fmin
xopt, fopt, used_budget = fmin(func=sum, x0=[1, 2, 3, 4], budget=1000, active=True, sigma0=2.5)

Python-only

The Python-only implementation revolves around the ModularCMAES class. The class has a run method, which runs the specified algorithm until a break condition is met.

import numpy as np
from modcma import ModularCMAES

def func(x: np.ndarray) -> float:
    return sum(x)

dim = 10
budget = 10_000

# Create an instance of the CMA-ES (no modules active)
cma = ModularCMAES(func, dim, budget=budget)

# Run until break conditions are met
cma = cma.run()

Alternatively, we can run the step method iteratively, for more fine-grained control over how the algorithm is executed.

cma = ModularCMAES(func, dim, budget=budget)

while not cma.break_conditions():
   cma.step()

At an even lower level, we can run the methods called by the step method separately, which are (in order) mutate, select, recombine and adapt. The following snippet shows an example using all four methods.

cma = ModularCMAES(func, dim, budget=budget)

while not cma.break_conditions():
   cma.mutate()
   cma.select()
   cma.recombine()
   cma.adapt()

Ask-Tell Interface

Often, it is useful to consider the algorithm in an ask-tell fashion, so that points can be evaluated sequentially while the objective function is controlled from outside the algorithm. For this purpose, we provide the AskTellCMAES interface, which can be used as follows:

from modcma import AskTellCMAES

# Instantiate an ask-tell cmaes. Note that the objective function argument is omitted here. 
# All other parameters, e.g. the active modules can be passed by keyword, similar to ModularCMAES
cma = AskTellCMAES(dim, budget=budget, active=True)

while not cma.break_conditions():
   # Retrieve a single new candidate solution
   xi = cma.ask()
   # Evaluate the objective function
   fi = func(xi)
   # Update the algorithm with the objective function value
   cma.tell(xi, fi)

C++ Backend

For performance reasons, we have also implemented the algorithm in C++, with an interface to Python. The algorithm can be accessed similarly from Python, but calling it is slightly more verbose. The ModularCMAES class in C++ accepts a single argument, a Parameters object. This object must be instantiated with a Settings object, which in turn is built from the problem dimension and a Modules object that can be used to specify module options. A boilerplate code example for this process is given in the following:

# import the c++ subpackage
from modcma import c_maes
# Instantiate a modules object
modules = c_maes.parameters.Modules()
# Create a settings object, here also optional parameters such as sigma0 can be specified
settings = c_maes.parameters.Settings(dim, modules, sigma0 = 2.5)
# Create a parameters object
parameters = c_maes.Parameters(settings)
# Pass the parameters object to the ModularCMAES optimizer class
cma = c_maes.ModularCMAES(parameters)

The APIs of the Python-only and C++ interfaces are largely similar, and a single run of the algorithm can be performed using the run function. One difference is that the objective function is now a parameter of the run function, instead of being passed when the class is instantiated.

cma.run(func)

Similarly, the step function is also directly exposed:

while not cma.break_conditions():
   cma.step(func)

Or by calling the functions that make up a step separately:

while not cma.break_conditions():
   cma.mutate(func)
   cma.select()
   cma.recombine()
   cma.adapt()
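
Putting these pieces together, the following is a minimal end-to-end sketch of the C++ backend. The sphere objective and the enabled elitist module are illustrative assumptions, not requirements; any callable that accepts a numpy array and returns a float should work as the objective.

import numpy as np
from modcma import c_maes

def sphere(x: np.ndarray) -> float:
    # Simple quadratic objective, used only for illustration
    return float(np.sum(x ** 2))

dim = 5
# Default modules, with one boolean module switched on for illustration
modules = c_maes.parameters.Modules()
modules.elitist = True
# Build settings -> parameters -> optimizer, as in the boilerplate above
settings = c_maes.parameters.Settings(dim, modules, sigma0=2.0)
cma = c_maes.ModularCMAES(c_maes.Parameters(settings))
# Run until a break condition (e.g. the evaluation budget) is met
cma.run(sphere)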

Modules

The Modular CMA-ES package provides various modules, grouped into 13 categories. For each category one option can be selected, and options from different categories can be arbitrarily combined. The following table lists the categories and the available options. Not all modules are available in both versions (i.e. some are only implemented in C++); an overview is given in the table. By default, the first option listed for a category is selected, and Boolean modules, i.e. modules that can only be turned on or off, are off by default. An example combining several modules is given after the table.

| Category              | Option           | Python | C++ |
|-----------------------|------------------|--------|-----|
| Matrix Adaptation     | Covariance       | Yes    | Yes |
|                       | Matrix           | No     | Yes |
|                       | Separable        | No     | Yes |
|                       | None             | No     | Yes |
| Active Update         | Off/On           | Yes    | Yes |
| Elitism               | Off/On           | Yes    | Yes |
| Orthogonal Sampling   | Off/On           | Yes    | Yes |
| Sequential Selection  | Off/On           | Yes    | Yes |
| Threshold Convergence | Off/On           | Yes    | Yes |
| Sample Sigma          | Off/On           | Yes    | Yes |
| Base Sampler          | Gaussian         | Yes    | Yes |
|                       | Sobol            | Yes    | Yes |
|                       | Halton           | Yes    | Yes |
| Recombination Weights | Default          | Yes    | Yes |
|                       | Equal            | Yes    | Yes |
|                       | $1/2^\lambda$    | Yes    | Yes |
| Mirrored Sampling     | Off              | Yes    | Yes |
|                       | On               | Yes    | Yes |
|                       | Pairwise         | Yes    | Yes |
| Step size adaptation  | CSA              | Yes    | Yes |
|                       | TPA              | Yes    | Yes |
|                       | MSR              | Yes    | Yes |
|                       | PSR              | Yes    | Yes |
|                       | XNES             | Yes    | Yes |
|                       | MXNES            | Yes    | Yes |
|                       | LPXNES           | Yes    | Yes |
| Restart Strategy      | Off              | Yes    | Yes |
|                       | Restart          | Yes    | Yes |
|                       | IPOP             | Yes    | Yes |
|                       | BIPOP            | Yes    | Yes |
| Bound Correction      | Off              | Yes    | Yes |
|                       | Saturate         | Yes    | Yes |
|                       | Mirror           | Yes    | Yes |
|                       | COTN             | Yes    | Yes |
|                       | Toroidal         | Yes    | Yes |
|                       | Uniform resample | Yes    | Yes |
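
As mentioned above, options from different categories can be freely combined. The following sketch enables several modules at once in the Python-only interface; the particular combination is arbitrary and chosen only for illustration, and each keyword argument corresponds to one of the categories documented below.

from modcma import ModularCMAES

# Illustrative combination of modules; any subset of categories can be mixed
cma = ModularCMAES(
    sum,                          # objective function to be minimized
    5,                            # problem dimensionality
    budget=5_000,                 # maximum number of function evaluations
    active=True,                  # Active Update
    elitist=True,                 # Elitism
    base_sampler="sobol",         # quasi-random base sampler
    mirrored="mirrored",          # Mirrored Sampling
    step_size_adaptation="msr",   # MSR step size adaptation
)
cma.run()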

Matrix Adaptation

The ModularCMAES can be turned into an implementation of the (fast-)MA-ES algorithm by changing the matrix_adaptation option from COVARIANCE to MATRIX in the Modules object. This is currently only available in the C++ version of the framework. An example of specifying this, using the required MatrixAdaptationType enum:

...
modules.matrix_adaptation = c_maes.options.MatrixAdaptationType.COVARIANCE
# or for MA-ES
modules.matrix_adaptation = c_maes.options.MatrixAdaptationType.MATRIX
# We can also perform only step size adaptation
modules.matrix_adaptation = c_maes.options.MatrixAdaptationType.NONE
# Or use the separable CMA-ES
modules.matrix_adaptation = c_maes.options.MatrixAdaptationType.SEPERABLE

Active Update

In the standard update of the covariance matrix C in the CMA-ES algorithm, only the most successful mutations are considered. However, the Active Update, introduced by Jastrebski et al., offers an alternative approach. This module adapts the covariance matrix by incorporating the least successful individuals with negative weights in the update process.

For the Python only version, this can be enabled by passing the option active=True:

cma = ModularCMAES(func, dim, active=True)

For the C++ version, this can be done by setting the appropriate value in the Modules object:

...
modules.active = True

Elitism

When this option is selected, (𝜇 + 𝜆)-selection is used instead of (𝜇, 𝜆)-selection. This can be useful to speed up convergence on unimodal problems, but can have a negative impact on population diversity.

For the Python only version, this can be enabled by passing the option elitist=True:

cma = ModularCMAES(func, dim, elitist=True)

For the C++ version, this can be done by setting the appropriate value in the Modules object:

...
modules.elitist = True

Orthogonal Sampling

Orthogonal Sampling was introduced by Wang et al. as an extension of Mirrored Sampling. This method improves sampling by ensuring that the newly sampled points in the population are orthonormalized using a Gram-Schmidt procedure.

For the Python only version, this can be enabled by passing the option orthogonal=True:

cma = ModularCMAES(func, dim, orthogonal=True)

And for C++:

...
modules.orthogonal = True

Sequential Selection

The Sequential Selection option offers an alternative approach to selection, originally proposed by Brockhoff et al., which saves objective function evaluations by immediately ranking and comparing candidate solutions with the current best solution. Whenever more than $\mu$ individuals have been sampled that improve on the current best solution, no additional function evaluations are performed.

For the Python only version, this can be enabled by passing the option sequential=True:

cma = ModularCMAES(func, dim, sequential=True)

And for C++:

...
modules.sequential_selection = True

Threshold Convergence

In evolution strategies (ES), balancing exploration and exploitation is a critical challenge. The Threshold Convergence option, proposed by Piad et al., provides a method to address this issue. It aims to prolong the exploration phase of evolution by requiring mutation vectors to reach a specific length threshold, which gradually decreases over successive generations to transition into local search.

For the Python only version, this can be enabled by passing the option threshold_convergence=True:

cma = ModularCMAES(func, dim, threshold_convergence=True)

And for C++:

...
modules.threshold_convergence = True

Sample Sigma

A method based on self-adaptation through co-evolution, as seen in classical evolution strategies, where the step size of each candidate solution is sampled separately from a log-normal distribution based on the global step size $\sigma$.

For the Python only version, this can be enabled by passing the option sample_sigma=True:

cma = ModularCMAES(func, dim, sample_sigma=True)

And for C++:

...
modules.sample_sigma = True

Quasi-Gaussian Sampling

Instead of performing simple random sampling from the multivariate Gaussian, new solutions can alternatively be drawn from quasi-random sequences (a.k.a. low-discrepancy sequences). We implemented two options for this module: the Halton and Sobol sequences.

This can be selected by setting base_sampler="sobol" or base_sampler="halton" in the Python-only version (the default is "gaussian"):

cma = ModularCMAES(func, dim, base_sampler="gaussian")
# or 
cma = ModularCMAES(func, dim, base_sampler="sobol")
# or 
cma = ModularCMAES(func, dim, base_sampler="halton")

For C++, the BaseSampler enum should be provided to the sampler member of the Modules object:

...
modules.sampler = c_maes.options.BaseSampler.GAUSSIAN
# or
modules.sampler = c_maes.options.BaseSampler.SOBOL
# or
modules.sampler = c_maes.options.BaseSampler.HALTON

Recombination Weights

We implemented three variants of the recombination weights used in the update of the strategy parameters: default, equal and $1/2^\lambda$.

This can be selected by setting weights_option="default", weights_option="equal" or weights_option="1/2^lambda" in the Python-only version:

cma = ModularCMAES(func, dim, weights_option="default")
# or 
cma = ModularCMAES(func, dim, weights_option="equal")
# or 
cma = ModularCMAES(func, dim, weights_option="1/2^lambda")

For C++, the RecombinationWeights enum should be provided to the weights member of the Modules object:

...
modules.weights = c_maes.options.RecombinationWeights.DEFAULT
# or
modules.weights = c_maes.options.RecombinationWeights.EQUAL
# or
modules.weights = c_maes.options.RecombinationWeights.HALF_POWER_LAMBDA

Mirrored Sampling

Mirrored Sampling, introduced by Brockhoff et al., aims to create a more evenly spaced sample of the search space. In this technique, half of the mutation vectors are drawn from the normal distribution, while the other half are the mirror image of the preceding random vectors. When using Pairwise Selection in combination with Mirrored Sampling, only the best point from each mirrored pair is selected for recombination. This approach ensures that the mirrored points do not cancel each other out during recombination. This module has three options, off, on and on + pairwise.

For Python, we can add the option mirrored="mirrored" or mirrored="mirrored pairwise":

cma = ModularCMAES(func, dim, mirrored=None)
# or
cma = ModularCMAES(func, dim, mirrored="mirrored")
# or 
cma = ModularCMAES(func, dim, mirrored="mirrored pairwise")

For C++ this can be configured using the c_maes.options.Mirror enum:

...
modules.mirrored = c_maes.options.Mirror.NONE
# or 
modules.mirrored = c_maes.options.Mirror.MIRRORED
# or 
modules.mirrored = c_maes.options.Mirror.PAIRWISE

Step size adaptation

Several methods for performing step size adaptation have been implemented in the framework. For more details on the implemented methods, we refer the interested reader to our 2021 paper.

The available options for step_size_adaptation in the Python-only interface are {"csa", "tpa", "msr", "xnes", "m-xnes", "lp-xnes", "psr"}, one of which can be selected and passed to the algorithm as an option, for example:

cma = ModularCMAES(func, dim, step_size_adaptation="csa")
# or
cma = ModularCMAES(func, dim, step_size_adaptation="msr")

The same options are available for the C++ version, but should be passed via the StepSizeAdaptation enum, which has the following values: {CSA, TPA, MSR, XNES, MXNES, LPXNES, PSR}, and can be configured via the ssa option:

...
modules.ssa = c_maes.options.StepSizeAdaptation.CSA
# or 
modules.ssa = c_maes.options.StepSizeAdaptation.MSR

Restart Strategy

Restarting an optimization algorithm like CMA-ES can be an effective way to overcome stagnation in the optimization process. The Modular CMA-ES package offers three restart strategies for such scenarios. The first restart option simply restarts the algorithm. When IPOP is enabled, the algorithm employs a restart strategy that increases the population size after every restart. BIPOP, on the other hand, not only changes the population size after a restart but alternates between larger and smaller population sizes.

For the Python-only interface, this option can be configured with four values: {None, "restart", "IPOP", "BIPOP"}:

cma = ModularCMAES(func, dim, local_restart=None)
# or
cma = ModularCMAES(func, dim, local_restart="IPOP")

For the C++ version these should be passed via the RestartStrategy enum, which has the following values available: {NONE, RESTART, IPOP, BIPOP, STOP} and can be configured via the restart_strategy option:

...
modules.restart_strategy = c_maes.options.RestartStrategy.NONE
# or 
modules.restart_strategy = c_maes.options.RestartStrategy.IPOP

Note that the C++ version has an additional option here, STOP, which forces the algorithm to stop whenever a restart condition is met (not to be confused with a break condition).
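
Configuring this option follows the same pattern as the other enum-based modules, for example:

modules.restart_strategy = c_maes.options.RestartStrategy.STOP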

Bound correction

Several methods for performing bound correction have been implemented in the framework. For more details on the implemented methods, we refer the interested reader to our 2021 paper.

The available options for bound_correction in the Python-only interface are {None, "saturate", "unif_resample", "COTN", "toroidal", "mirror"}, one of which can be selected and passed to the algorithm as an option, for example:

cma = ModularCMAES(func, dim, bound_correction=None)
# or
cma = ModularCMAES(func, dim, bound_correction="saturate")

The same options are available for the C++ version, but should be passed via the CorrectionMethod enum, which has the following values: {NONE, SATURATE, UNIFORM_RESAMPLE, COTN, TOROIDAL, MIRROR}, and can be configured via the bound_correction option:

...
modules.bound_correction = c_maes.options.CorrectionMethod.NONE
# or 
modules.bound_correction = c_maes.options.CorrectionMethod.SATURATE

Citation

The following BibTeX entry can be used to cite this work:

@inproceedings{denobel2021,
   author = {de Nobel, Jacob and Vermetten, Diederick and Wang, Hao and Doerr, Carola and B\"{a}ck, Thomas},
   title = {Tuning as a Means of Assessing the Benefits of New Ideas in Interplay with Existing Algorithmic Modules},
   year = {2021},
   isbn = {9781450383516},
   publisher = {Association for Computing Machinery},
   address = {New York, NY, USA},
   url = {https://doi.org/10.1145/3449726.3463167},
   doi = {10.1145/3449726.3463167},
   booktitle = {Proceedings of the Genetic and Evolutionary Computation Conference Companion},
   pages = {1375–1384},
   numpages = {10},
   location = {Lille, France},
   series = {GECCO '21}
}

License

This project is licensed under the MIT License - see the LICENSE file for details.

