Save and load Bioconductor objects in Python


The dolomite-base package is the Python counterpart to the alabaster.base R package for language-agnostic reading and writing of Bioconductor objects (see the BiocPy project). This is a more robust and portable alternative to the typical approach of pickling Python objects to save them to disk.

  • By separating the on-disk representation from the in-memory object structure, we can more easily adapt to changes in class definitions. This improves robustness to Python environment updates.
  • By using standard file formats like HDF5 and CSV, we ensure that the objects can be easily read from other languages like R and JavaScript. This improves interoperability between application ecosystems.
  • By breaking up complex Bioconductor objects into their components, we enable modular reads and writes to the backing store. We can easily read or update part of an object without having to consider the other parts.

The dolomite-base package defines the base generics to read and write the file structures along with the associated metadata. Implementations of these methods for various Bioconductor classes can be found in the other dolomite packages like dolomite-ranges and dolomite-se.
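The generic-plus-registry design can be illustrated with a plain functools.singledispatch sketch. The names below (save_anything, read_anything, read_registry) are illustrative mocks, not dolomite's actual internals; they only demonstrate how a saver dispatches on the object's type while a reader dispatches on the "type" string recorded in the OBJECT file.

```python
import json
import os
import tempfile
from functools import singledispatch

# Registry mapping the on-disk "type" string to a reader function,
# mirroring how readers are looked up from the OBJECT file's metadata.
read_registry = {}

@singledispatch
def save_anything(x, path: str):
    raise NotImplementedError("no saver for " + str(type(x)))

@save_anything.register
def _(x: dict, path: str):
    # Saver for plain dicts: one directory per object, with an OBJECT
    # file declaring the type plus the actual contents alongside it.
    os.mkdir(path)
    with open(os.path.join(path, "OBJECT"), "w") as handle:
        json.dump({"type": "simple_dict"}, handle)
    with open(os.path.join(path, "contents.json"), "w") as handle:
        json.dump(x, handle)

def read_simple_dict(path: str):
    with open(os.path.join(path, "contents.json")) as handle:
        return json.load(handle)

read_registry["simple_dict"] = read_simple_dict

def read_anything(path: str):
    # Inspect the OBJECT file to decide which reader to call.
    with open(os.path.join(path, "OBJECT")) as handle:
        metadata = json.load(handle)
    return read_registry[metadata["type"]](path)

tmp = os.path.join(tempfile.mkdtemp(), "demo")
save_anything({"a": 1}, tmp)
roundtrip = read_anything(tmp)
```

Because dispatch happens through the generic and the registry, new types can be supported by registering new functions without touching the core save/read entry points.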

Quick start

First, we'll install the dolomite-base package. This package is available from PyPI so we can use the standard installation process:

pip install dolomite-base

The simplest example involves saving a BiocFrame inside a staging directory. Let's mock one up:

import biocframe
df = biocframe.BiocFrame({
    "X": list(range(0, 10)),
    "Y": [ "a", "b", "c", "d", "e", "f", "g", "h", "i", "j" ]
})
print(df)
## BiocFrame with 10 rows and 2 columns
##           X      Y
##     <range> <list>
## [0]       0      a
## [1]       1      b
## [2]       2      c
## [3]       3      d
## [4]       4      e
## [5]       5      f
## [6]       6      g
## [7]       7      h
## [8]       8      i
## [9]       9      j

We save our BiocFrame to a user-specified directory with the save_object() function. This function saves its input object to file according to the relevant specification.

import tempfile
import os
tmp = tempfile.mkdtemp()

import dolomite_base
path = os.path.join(tmp, "my_df")
dolomite_base.save_object(df, path)

os.listdir(path)
## ['basic_columns.h5', 'OBJECT']

We load the contents of the directory back into a Python session by using the read_object() function. Note that the exact Python types for the BiocFrame columns may not be preserved by the round trip, though the contents of the columns will be unchanged.

out = dolomite_base.read_object(path)
print(out)
## BiocFrame with 10 rows and 2 columns
##                    X            Y
##     <ndarray[int32]> <StringList>
## [0]                0            a
## [1]                1            b
## [2]                2            c
## [3]                3            d
## [4]                4            e
## [5]                5            f
## [6]                6            g
## [7]                7            h
## [8]                8            i
## [9]                9            j

Check out the API reference for more details.

Supported classes

The saving/reading process can be applied to a range of BiocPy data structures, provided the appropriate dolomite package is installed. Each package implements a saving and reading function for its associated classes, which are automatically used from dolomite-base's save_object() and read_object() functions, respectively. (That is, there is no need to explicitly import a package when calling save_object() or read_object() for its classes.)

Package          Object types
dolomite-base    BiocFrame, list, dict, NamedList
dolomite-matrix  numpy.ndarray, scipy.sparse.spmatrix, DelayedArray
dolomite-ranges  GenomicRanges, GenomicRangesList
dolomite-se      SummarizedExperiment, RangedSummarizedExperiment
dolomite-sce     SingleCellExperiment
dolomite-mae     MultiAssayExperiment

All of the listed packages are available from PyPI and can be installed with the usual pip install procedure. Alternatively, to install all packages in one go, users can install the dolomite umbrella package.
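For example, assuming the umbrella package is published on PyPI under the name dolomite, a single command installs the whole suite:

```shell
pip install dolomite
```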

Operating on directories

Users can freely rename or relocate directories, and the read_object() function will still work. For example, we can copy the entire directory to a new file system and everything will still be correctly referenced within the directory. The simplest way to share objects is to zip or tar the staging directory for ad hoc distribution, though more serious applications will use storage systems like AWS S3 for easier distribution.

# Mocking up an object:
import biocframe
df = biocframe.BiocFrame({
    "X": list(range(0, 10)),
    "Y": [ "a", "b", "c", "d", "e", "f", "g", "h", "i", "j" ]
})

# Saving to one location:
import tempfile
import os
import dolomite_base
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "my_df")
dolomite_base.save_object(df, path)

# Reading from another location:
alt_path = os.path.join(tmp, "foobar")
os.rename(path, alt_path)
alt_out = dolomite_base.read_object(alt_path)

That said, it is unwise to manipulate the files inside the directory created by save_object(). Reading functions typically depend on specific file names or subdirectory structures within the directory, and fiddling with them may cause unexpected results. However, advanced users can exploit the predictable directory structure by loading individual components from subdirectories when the full object is not required:

# Creating a nested DF:
nested = biocframe.BiocFrame({ "A": df })
nest_path = os.path.join(tmp, "nesting")
dolomite_base.save_object(nested, nest_path)

# Now reading in the nested DF:
redf = dolomite_base.read_object(os.path.join(nest_path, "other_columns", "0"))

Validating files

Each Bioconductor class's on-disk representation is determined by the associated takane specification. For example, save_object() will save a BiocFrame according to the data_frame specification. More complex objects may be represented by multiple files, possibly including subdirectories with "child" objects.

Each call to save_object() will automatically enforce the relevant specification by validating the directory contents with dolomite-base's validate_object() function. Successful validation provides some guarantees on the file structure within the directory, allowing developers to reliably implement readers in other frameworks. Conversely, the alabaster suite applies the same validators on directories generated within an R session, which ensures that dolomite-base is able to read those objects into a Python environment.

Users can also call validate_object() themselves, if they have modified the directory after calling save_object() and they want to check that the contents are still valid:

# Mocking up an object:
import biocframe
df = biocframe.BiocFrame({
    "X": list(range(0, 10)),
    "Y": [ "a", "b", "c", "d", "e", "f", "g", "h", "i", "j" ]
})

# Saving to one location:
import tempfile
import os
import dolomite_base
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "my_df")
dolomite_base.save_object(df, path)

# So far so good...
dolomite_base.validate_object(path)

# Deleting the file to make it invalid:
os.remove(os.path.join(path, "basic_columns.h5"))
dolomite_base.validate_object(path)
## Traceback (most recent call last):
## etc...

Extending to new classes

The dolomite framework is easily extended to new classes by:

  1. Writing a method for save_object(). This should accept an instance of the object and a path to a directory, and save the contents of the object inside the directory. It should also produce an OBJECT file that specifies the type of the object, e.g., data_frame, hdf5_sparse_matrix.
  2. Writing a function for read_object() and registering it in the read_object_registry. This should accept a path to a directory and read its contents to reconstruct the object. The registered type should be the same as that used in the OBJECT file.
  3. Writing a function for validate_object() and registering it in the validate_object_registry. This should accept a path to a directory and read its contents to determine if it is a valid on-disk representation. The registered type should be the same as that used in the OBJECT file.
    • (optional) Developers can alternatively formalize the on-disk representation by adding a specification to the takane repository. This aims to provide C++-based validators for each representation, allowing us to enforce consistency across multiple languages (e.g., R). Any takane validator is automatically used by validate_object() so no registration is required.

To illustrate, let's extend dolomite to a new custom class:

class Coffee:
    def __init__(self, beans: str, milk: bool):
        self.beans = beans
        self.milk = milk

First, we implement the saving method. Note that we add a @validate_saves decorator to instruct save_object() to automatically run validate_object() on the directory created by the Coffee method. This confirms that the output is valid according to our (yet to be added) validator method.

import dolomite_base
import os
import json

@dolomite_base.save_object.register
@dolomite_base.validate_saves
def save_object_for_Coffee(x: Coffee, path: str, **kwargs):
    os.mkdir(path)
    with open(os.path.join(path, "bean_type"), "w") as handle:
        handle.write(x.beans)
    with open(os.path.join(path, "has_milk"), "w") as handle:
        handle.write("true" if x.milk else "false")
    with open(os.path.join(path, "OBJECT"), "w") as handle:
        json.dump({ "type": "coffee", "coffee": { "version": "0.1" } }, handle)

Then we implement and register the reading method:

from typing import Dict

def read_Coffee(path: str, metadata: Dict, **kwargs) -> Coffee:
    metadata["coffee"]["version"] # possibly do something different based on version
    with open(os.path.join(path, "bean_type"), "r") as handle:
        beans = handle.read()
    with open(os.path.join(path, "has_milk"), "r") as handle:
        milk = (handle.read() == "true")
    return Coffee(beans, milk)

dolomite_base.read_object_registry["coffee"] = read_Coffee

And finally, the validation method:

def validate_Coffee(path: str, metadata: Dict):
    metadata["coffee"]["version"] # possibly do something different based on version
    with open(os.path.join(path, "bean_type"), "r") as handle:
        beans = handle.read()
        if beans not in [ "arabica", "robusta", "excelsa", "liberica" ]:
            raise ValueError("wrong bean type '" + beans + "'")
    with open(os.path.join(path, "has_milk"), "r") as handle:
        milk = handle.read()
        if milk not in [ "true", "false" ]:
            raise ValueError("invalid milk '" + milk + "'")

dolomite_base.validate_object_registry["coffee"] = validate_Coffee

Let's run them and see how it works:

cup = Coffee("arabica", milk=False)

import tempfile
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "stuff")
dolomite_base.save_object(cup, path)

cup2 = dolomite_base.read_object(path)
print(cup2.beans)
## arabica

For more complex objects that are composed of multiple smaller "child" objects, developers should consider saving each of their children in subdirectories of path. This can be achieved by calling alt_save_object() and alt_read_object() in the saving and loading functions, respectively. (We use the alt_* versions of these functions to respect application overrides, see below.)
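The parent/child layout can be mocked with the standard library alone. This sketch is purely illustrative and does not use dolomite's actual alt_save_object()/alt_read_object() functions; it only shows the idea that each child object lives in its own subdirectory, so a single component can be read without touching its siblings.

```python
import json
import os
import tempfile

def save_parent(children: dict, path: str):
    # The parent creates the directory, records its own type in OBJECT,
    # then saves each child in a dedicated subdirectory (analogous to
    # delegating to a per-child saving function).
    os.mkdir(path)
    with open(os.path.join(path, "OBJECT"), "w") as handle:
        json.dump({"type": "toy_parent"}, handle)
    for name, child in children.items():
        child_dir = os.path.join(path, name)
        os.mkdir(child_dir)
        with open(os.path.join(child_dir, "contents.json"), "w") as handle:
            json.dump(child, handle)

def read_child(path: str, name: str):
    # A modular read: only the requested child's files are opened.
    with open(os.path.join(path, name, "contents.json")) as handle:
        return json.load(handle)

tmp = os.path.join(tempfile.mkdtemp(), "composite")
save_parent({"first": [1, 2, 3], "second": "hello"}, tmp)
part = read_child(tmp, "second")
```

This is what makes the modular reads described earlier possible: the parent's reader composes its children, but nothing stops a caller from reading just one subdirectory.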

Creating applications

Developers can also create applications that customize the machinery of the dolomite framework for specific needs. In most cases, this involves storing more metadata to describe the object in more detail. For example, we might want to remember the identity of the author for each object. This is achieved by creating an application-specific saving generic with the same signature as save_object():

from functools import singledispatch
from typing import Any, Dict, Optional
import dolomite_base
import json
import os
import getpass
import biocframe

def dump_extra_metadata(path: str, extra: Dict):
    user_id = getpass.getuser()
    # File names with leading underscores are reserved for application-specific
    # use, so they won't clash with anything produced by save_object().
    metapath = os.path.join(path, "_metadata.json")
    with open(metapath, "w") as handle:
        json.dump({ **extra, "author": user_id }, handle)

@singledispatch
def app_save_object(x: Any, path: str, **kwargs):
    dolomite_base.save_object(x, path, **kwargs) # does the real work
    dump_extra_metadata(path, {}) # adding some application-specific metadata

@app_save_object.register
def app_save_object_for_BiocFrame(x: biocframe.BiocFrame, path: str, **kwargs):
    dolomite_base.save_object(x, path, **kwargs) # does the real work
    # We can also override specific methods to add object+application-specific metadata:
    dump_extra_metadata(path, { "columns": x.get_column_names().as_list() })

In general, applications should avoid modifying the files created by the dolomite_base.save_object() call, to avoid violating any takane format specifications (unless the application maintainer really knows what they're doing). Applications are free to write to any path starting with an underscore as this will not be used by any specification.

Once a generic is defined, applications should call alt_save_object_function() to instruct alt_save_object() to use it instead of dolomite_base.save_object(). This ensures that the customizations are applied to all child objects, such as the nested BiocFrame below.

# Create a friendly user-visible function to perform the generic override; this
# is reversed on function exit to avoid interfering with other applications.
def save_for_application(x, path: str, **kwargs):
    old = dolomite_base.alt_save_object_function(app_save_object)
    try:
        dolomite_base.alt_save_object(x, path, **kwargs)
    finally:
        dolomite_base.alt_save_object_function(old)

# Saving our nested BiocFrames with our overrides active.
import biocframe
df = biocframe.BiocFrame({
    "A": [1, 2, 3, 4],
    "B": biocframe.BiocFrame({
        "C": ["a", "b", "c", "d"]
    })
})

import tempfile
tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "foobar")
save_for_application(df, path)

# Both the parent and child BiocFrames have new metadata.
with open(os.path.join(path, "_metadata.json"), "r") as handle:
    print(handle.read())
## {"columns": ["A", "B"], "author": "aaron"}

with open(os.path.join(path, "other_columns", "1", "_metadata.json"), "r") as handle:
    print(handle.read())
## {"columns": ["C"], "author": "aaron"}

The reading function can be similarly overridden by setting alt_read_object_function() to instruct all alt_read_object() calls to use the override. This allows applications to, e.g., do something with the metadata that we just added.

def app_read_object(path: str, metadata: Optional[Dict] = None, **kwargs):
    if metadata is None:
        with open(os.path.join(path, "OBJECT"), "r") as handle:
            metadata = json.load(handle)

    # Print custom message based on the type and application-specific metadata.
    with open(os.path.join(path, "_metadata.json"), "r") as handle:
        appmeta = json.load(handle)
        print("I am a " + metadata["type"] + " created by " + appmeta["author"])
        if metadata["type"] == "data_frame":
            print("I have the following columns: " + ", ".join(appmeta["columns"]))

    return dolomite_base.read_object(path, metadata=metadata, **kwargs)

# Creating a user-friendly function to set the override before the read operation.
def read_for_application(path: str, metadata: Optional[Dict] = None, **kwargs):
    old = dolomite_base.alt_read_object_function(app_read_object)
    try:
        return dolomite_base.alt_read_object(path, metadata=metadata, **kwargs)
    finally:
        dolomite_base.alt_read_object_function(old)

# This diverts to the override with printing of custom messages.
read_for_application(path)
## I am a data_frame created by aaron
## I have the following columns: A, B
## I am a data_frame created by aaron
## I have the following columns: C

By overriding the saving and reading process for one or more classes, each application can customize the behavior of the dolomite framework to their own needs.
