
ark-analysis


Toolbox for analyzing multiplexed imaging data.

Full documentation for the project can be found here.

Table of Contents

  • Overview
  • Installation Steps
  • External Tools
  • Updating the Repository
  • Example Dataset
  • Questions?
  • Want to contribute?
  • How to Cite

Overview

This repo contains tools for analyzing multiplexed imaging data. The assumption is that you've already performed any necessary image processing on your data (such as denoising, background subtraction, autofluorescence correction, etc.), and that it is ready to be analyzed. For MIBI data, we recommend using the toffy processing pipeline.

We have recorded workshop talks that complement the repository; see the MIBI Workshop Playlist.

1. Segmentation

The segmentation notebook will walk you through the process of using Mesmer to segment your image data. This includes selecting the appropriate channel(s) for segmentation, running your data through the network, and then extracting single-cell statistics from the resulting segmentation mask. Workshop Talk - Session V - Part 1: Segmentation

  • Note: It is assumed that the cell table uses the default column names as in ark/settings.py. Refer to the docs for descriptions of the cell table columns and for methods to adjust them if necessary. A minimal sanity-check sketch follows this list.
  • If you plan to segment out non-traditional cellular structures such as protein aggregates or cytoplasmic projections often found in brain cells (e.g. microglia, astrocytes, and neuropil), try the companion ezSegmenter notebook, either stand-alone or in combination with the standard cell segmentation process above.
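
If you are bringing your own cell table, a quick pandas sanity check along these lines can catch renamed columns early. This is a minimal sketch: the path follows the example dataset layout described later in this README, and the column names are an assumed subset, with the authoritative defaults defined in ark/settings.py.

import pandas as pd
# Minimal sketch: check a cell table for the default columns ark expects.
# The path follows the example dataset layout; the column names below are an
# assumed subset -- ark/settings.py holds the authoritative defaults.
cell_table = pd.read_csv("segmentation/cell_table/cell_table_size_normalized.csv")
expected_cols = {"fov", "label", "cell_size"}
missing = expected_cols - set(cell_table.columns)
if missing:
    print(f"Cell table is missing expected columns: {sorted(missing)}")
else:
    print("Cell table contains the expected default columns.")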

2. Pixel clustering with Pixie

The first step in the Pixie pipeline is to run the pixel clustering notebook. The notebook walks you through generating pixel clusters for your data: you specify which markers to use for clustering, train a model, use it to classify your entire dataset, and generate pixel cluster overlays. The notebook also includes a GUI for manual cluster adjustment and annotation. Workshop Talk - Session IV - Pixel Level Analysis
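
To make the idea concrete, here is a conceptual sketch of pixel-level clustering. It is not Pixie itself: the notebook uses a self-organizing map followed by meta-clustering, whereas this sketch substitutes k-means, and the marker subset and file layout (taken from the example dataset described later) are assumptions.

import numpy as np
import tifffile
from sklearn.cluster import MiniBatchKMeans
# Conceptual sketch only -- Pixie's notebook uses a SOM plus meta-clustering.
# The marker subset and file layout (from the example dataset) are assumptions.
markers = ["CD3", "CD8", "CD20"]
fov_dir = "image_data/fov0"
# Stack the chosen channels into a (n_pixels, n_markers) matrix.
channels = [tifffile.imread(f"{fov_dir}/{m}.tiff").astype(np.float32) for m in markers]
pixel_matrix = np.stack(channels, axis=-1).reshape(-1, len(markers))
# Normalize each pixel so clustering reflects marker composition, not intensity.
pixel_matrix /= np.clip(pixel_matrix.sum(axis=1, keepdims=True), 1e-9, None)
# Cluster pixels and reshape the labels back into an image-shaped mask.
kmeans = MiniBatchKMeans(n_clusters=20, random_state=0).fit(pixel_matrix)
pixel_cluster_mask = kmeans.labels_.reshape(channels[0].shape)
print(pixel_cluster_mask.shape)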

3. Cell clustering with Pixie

The second step in the Pixie pipeline is to run the cell clustering notebook. This notebook uses the pixel clusters generated in the first notebook to cluster the cells in your dataset, walks you through generating cell clusters, and produces cell cluster overlays. The notebook also includes a GUI for manual cluster adjustment and annotation. Workshop Talk - Session V - Cell-level Analysis - Part 2: Cell Clustering
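
As before, this is only a conceptual sketch rather than the notebook's actual workflow (which uses a SOM plus meta-clustering): it summarizes each cell by the composition of pixel clusters it contains and then clusters those compositions. The paths follow the example dataset layout; everything else is an assumption.

import numpy as np
import tifffile
from sklearn.cluster import MiniBatchKMeans
# Conceptual sketch only -- the notebook itself uses a SOM plus meta-clustering.
# Paths follow the example dataset layout; both masks are assumed to share a shape.
seg_mask = tifffile.imread("segmentation/deepcell_output/fov0_whole_cell.tiff")
pixel_mask = tifffile.imread("segmentation/example_pixel_output_dir/pixel_masks/fov0_pixel_mask.tiff")
cell_ids = np.unique(seg_mask)
cell_ids = cell_ids[cell_ids != 0]          # 0 is background
n_pixel_clusters = int(pixel_mask.max()) + 1
# Summarize each cell by the fraction of its pixels in each pixel cluster.
composition = np.zeros((len(cell_ids), n_pixel_clusters))
for i, cid in enumerate(cell_ids):
    labels = pixel_mask[seg_mask == cid].astype(np.int64)
    counts = np.bincount(labels, minlength=n_pixel_clusters)
    composition[i] = counts / max(counts.sum(), 1)
# Cluster the per-cell compositions into candidate cell clusters.
cell_clusters = MiniBatchKMeans(n_clusters=10, random_state=0).fit_predict(composition)
print(dict(zip(cell_ids[:5].tolist(), cell_clusters[:5].tolist())))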

4. Post Clustering Tasks

After the Pixie pipeline, the user can inspect and fine-tune their results with the post clustering notebook. This notebook covers cleaning up artifacts left over from clustering and working with functional markers.
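
As a rough illustration of the functional-marker step (not the notebook itself), the sketch below binarizes a couple of functional markers in the cell table using hand-picked thresholds; the threshold values, and the assumption that markers appear as cell table columns, are purely illustrative.

import pandas as pd
# Rough illustration of functional-marker positivity, not the notebook itself.
# The thresholds (and the assumption that markers are columns) are illustrative.
cell_table = pd.read_csv("segmentation/cell_table/cell_table_size_normalized.csv")
thresholds = {"Ki67": 0.002, "PD1": 0.001}   # hand-picked, hypothetical cutoffs
for marker, cutoff in thresholds.items():
    cell_table[f"{marker}_positive"] = cell_table[marker] > cutoff
# Fraction of cells positive for each marker.
print(cell_table.filter(like="_positive").mean())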

5. Spatial Analysis

Workshop Talk - Session VI - Spatial Analysis - Part 1: Choosing the Right Analysis Tool.

  1. Pairwise Enrichment Analysis

    The pairwise enrichment notebook allows the user to investigate the interactions between the phenotypes present in their data. In addition, users can cluster based on the phenotypes surrounding a particular feature, such as an artery or gland. Workshop Talk - Session VI - Spatial Analysis - Part 2: Pairwise Spatial Enrichment.

  2. K-means Neighborhood Analysis

    The neighborhood analysis notebook sheds light on neighborhoods: micro-environments made up of a collection of cell phenotypes. Workshop Talk - Session VI - Spatial Analysis - Part 3: K-means Neighborhood Analysis.

  3. Spatial LDA

    The preprocessing and training / inference draw from language analysis, specifically topic modelling. Spatial LDA overlays a probability distribution on cells belonging to any particular micro-environment. A conceptual sketch of the neighborhood and topic-model ideas follows this list. Workshop Talk - Session VI - Spatial Analysis - Part 4: Spatial LDA.
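
The sketch below shows the shared starting point of the neighborhood and topic-model approaches: build a per-cell neighborhood composition, then apply k-means for hard neighborhood labels and LDA for soft topic weights. It is a conceptual stand-in for the ark notebooks, and the column names, neighbor count, and cluster/topic numbers are all assumptions.

import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.neighbors import NearestNeighbors
# Conceptual sketch only -- not the ark notebooks. The column names below
# ("fov", "centroid-0", "centroid-1", "cell_meta_cluster") are assumptions.
cells = pd.read_csv("segmentation/cell_table/cell_table_size_normalized_cell_labels.csv")
fov = cells[cells["fov"] == "fov0"]
coords = fov[["centroid-0", "centroid-1"]].to_numpy()
phenotypes = fov["cell_meta_cluster"].to_numpy()
# Per-cell neighborhood composition: counts of each phenotype among the
# k nearest neighboring cells (the cell itself is excluded).
k = 20
_, idx = NearestNeighbors(n_neighbors=k + 1).fit(coords).kneighbors(coords)
categories = pd.unique(phenotypes)
counts = np.zeros((len(fov), len(categories)), dtype=int)
for row, neighbors in enumerate(idx[:, 1:]):
    for col, cat in enumerate(categories):
        counts[row, col] = np.sum(phenotypes[neighbors] == cat)
# K-means neighborhood analysis: each cell gets a hard micro-environment label.
neighborhoods = KMeans(n_clusters=6, random_state=0).fit_predict(counts)
# Spatial-LDA flavor: each cell gets soft topic weights instead of one label.
topic_weights = LatentDirichletAllocation(n_components=6, random_state=0).fit_transform(counts)
print(neighborhoods[:5], topic_weights[:2].round(2))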

Installation Steps

Pip Installation

You can install the latest version of ark with:

pip install ark-analysis

However, the repository will still need to be cloned if you wish to use the Jupyter Notebooks.

Download the Repo

We recommend using the latest release of ark. You can find all the available versions in the Releases Section. Open a terminal and navigate to where you want the code stored.

If you would like to use the latest version of ark, simply clone the project and create the Conda environment:

git clone -b v0.7.0 https://github.com/angelolab/ark-analysis.git
cd ark-analysis
conda env create -f environment.yml

Setting up Docker

There is a complementary setup video.

Next, you'll need to download Docker Desktop:

  • First, download and install Docker Desktop.
  • Once it's successfully installed, make sure it is running by looking in the toolbar for the Docker whale icon.

Running on Windows

Our repo runs best on Linux-based systems (including macOS). If you need to run on Windows, please consult our Windows guide for additional instructions.

Using the Repository (Running the Docker)

Enter the following command into your terminal from the same directory you ran the above commands:

./start_docker.sh

If running for the first time, or if our Docker image has been updated, it may take a while to build and set up before completion.

This will generate a link to a Jupyter notebook. Copy the last URL (the one with 127.0.0.1:8888 at the beginning) into your web browser.

Be sure to keep this terminal open. Do not exit the terminal or enter control-c until you are finished with the notebooks.

NOTE:

If you already have a Jupyter session open when you run ./start_docker.sh, you will receive a couple of additional prompts.

Copy the URL listed after Enter this URL instead to access the notebooks:

You will need to authenticate. Note the last URL (the one with 127.0.0.1:8888 at the beginning), copy the token that appears there (after token= in the URL), paste it into the password prompt of the Jupyter notebook, and log in.

You can shut down the notebooks and close Docker by entering control-c in the terminal window.

REMEMBER TO DUPLICATE AND RENAME NOTEBOOKS

If you didn't change the name of the notebooks within the templates folder, they will be overwritten when you update the repo. Read about updating ark here.

External Tools

Mantis Viewer

Mantis is a multiplexed image viewer developed by the Parker Institute. It has built-in functionality for easily viewing multichannel images, creating overlays, and concurrently displaying image features alongside raw channels. We have found it to be extremely useful for analyzing the output of our analysis pipeline. There are detailed instructions on their download page for how to install and use the tool. Below are some details specifically related to how we use it in ark. Workshop Talk - Session V - Cell-level Analysis - Part 3: Assessing Accuracy with Mantis Viewer.

Mantis directory structure

Mantis expects image data to have a specific organization in order to display it. It is quite similar to how MIBI data is already stored, with a unique folder for each FOV and all channels as individual tiffs within that folder. Any notebooks that suggest using Mantis Viewer to inspect results will automatically save the data in the format shown below.

mantis
│ 
├── fov0
│   ├── cell_segmentation.tiff
│   ├── chan0.tiff
│   ├── chan1.tiff
│   ├── chan2.tiff
│   ├── ...
│   ├── population_mask.csv
│   └── population_mask.tiff
├── fov1
│   ├── cell_segmentation.tiff
│   ├── chan0.tiff
│   ├── chan1.tiff
│   ├── chan2.tiff
│   ├── ...
│   ├── population_mask.csv
│   └── population_mask.tiff
└── marker_counts.csv

Loading image-specific files

Besides the images, there are additional files in the directory structure which can be read into Mantis.

cell_segmentation: This file contains the predicted segmentation for each cell in the image, and allows Mantis to identify individual cells.

population_pixel_mask: This file maps the individual pixel clusters generated by Pixie in the pixel clustering notebook to the image data.

population_cell_mask: Same as above, but for cell clusters instead of pixel clusters.

These files should be specified when first initializing a project in Mantis.

Loading project-wide files

When inspecting the output of the clustering notebooks, it is often useful to add project-wide .csv files, such as marker_counts.csv. These files contain information, such as the average expression of a given marker, across all the cells in the project. Project-wide files can either be loaded at project initialization or loaded into an existing project via Import -> Segment Features -> For project from CSV.

View cell features

Once you have loaded the project-wide files into Mantis, you'll need to decide which of the features you want to view. Click on Show Plot Plane at the bottom right, then select the marker you want to assess. This will then allow you to view the cell expression of that marker when you mouse over the cell in Mantis.

External Hard Drives and Google File Stream

To configure external hard drive (or Google File Stream) access, you will have to add this to Docker's file paths in the Preferences menu.

On Docker for macOS, this can be found in Preferences -> Resources -> File Sharing. Adding /Volumes will allow Docker to see external drives.

On Docker for Windows with the WSL2 backend, no paths need to be added. However, if using the Hyper-V backend, these paths will need to be added as in the macOS case.

Once the path is added, you can run:

bash start_docker.sh --external 'path/added/to/preferences'

or

bash start_docker.sh -e 'path/added/to/preferences'

to mount the drive into the virtual /data/external path inside the Docker container.

Updating the Repository

This project is still under development, and we are making frequent changes and improvements. If you want to update the version on your computer to have the latest changes, perform the following steps. Otherwise, we recommend waiting for new releases.

First, get the latest version of the repository.

git pull

Then, run the command below to update the Jupyter notebooks to the latest version:

./start_docker.sh --update

or

./start_docker.sh -u

If you have made changes to these notebooks that you would like to keep (specific file paths, settings, custom routines, etc), rename them before updating!

For example, rename your existing copy of 1_Segment_Image_Data.ipynb to 1_Segment_Image_Data_old.ipynb. Then, after running the update command, a new version of 1_Segment_Image_Data.ipynb will be created with the newest code, and your old copy will exist with the new name that you gave it.

After updating, you can copy over any important paths or modifications from the old notebooks into the new notebook.

Example Dataset

If you would like to test out the pipeline, then we have incorporated an example dataset within the notebooks. Currently the dataset contains 11 FOVs with 22 channels (CD3, CD4, CD8, CD14, CD20, CD31, CD45, CD68, CD163, CK17, Collagen1, ECAD, Fibronectin, GLUT1, H3K9ac, H3K27me3, HLADR, IDO, Ki67, PD1, SMA, Vim), and intermediate data necessary for each notebook in the pipeline.

The dataset is split into several smaller components, with each Jupyter notebook using a combination of those components. We use Hugging Face to store the dataset and its API to create these configurations. You can view the dataset's repository as well.
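
If you want to fetch the example dataset outside of the notebooks (they normally download it for you), a minimal sketch using the Hugging Face Hub client is shown below; the repo id and local directory are assumptions, so check the dataset's repository page for the exact values.

from huggingface_hub import snapshot_download
# Minimal sketch: pull the example dataset directly from the Hugging Face Hub.
# The repo id and local_dir are assumptions -- check the dataset's repository
# page for the exact values. The notebooks normally handle this download.
local_path = snapshot_download(
    repo_id="angelolab/ark_example",   # assumed repo id
    repo_type="dataset",
    local_dir="data/example_dataset",
)
print(local_path)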

Dataset Compartments

Image Data: This compartment stores the tiff files for each channel, for every FOV.

image_data/
├── fov0/
│  ├── CD3.tiff
│  ├── ...
│  └── Vim.tiff
├── fov1/
│  ├── CD3.tiff
│  ├── ...
│  └── Vim.tiff
├── .../

Cell Table: This compartment stores the various cell tables which get generated by Notebook 1.

segmentation/cell_table/
├── cell_table_arcsinh_transformed.csv
├── cell_table_size_normalized.csv
└── cell_table_size_normalized_cell_labels.csv

Deepcell Output: This compartment stores the segmentation images produced by running DeepCell.

segmentation/deepcell_output/
├── fov0_whole_cell.tiff
├── fov0_nuclear.tiff
├── ...
├── fov10_whole_cell.tiff
└── fov10_nuclear.tiff

Example Pixel Output: This compartment stores the feather files, CSVs, and pixel masks generated by pixel clustering.

segmentation/example_pixel_output_dir/
├── cell_clustering_params.json
├── channel_norm.feather
├── channel_norm_post_rowsum.feather
├── pixel_thresh.feather
├── pixel_channel_avg_meta_cluster.csv
├── pixel_channel_avg_som_cluster.csv
├── pixel_masks/
│  ├── fov0_pixel_mask.tiff
│  └── fov1_pixel_mask.tiff
├── pixel_mat_data/
│  ├── fov0.feather
│  ├── ...
│  └── fov10.feather
├── pixel_mat_subset/
│  ├── fov0.feather
│  ├── ...
│  └── fov10.feather
├── pixel_meta_cluster_mapping.csv
└── pixel_som_weights.feather

Example Cell Output: This compartment stores the feather files, CSVs, and cell masks generated by cell clustering.

segmentation/example_cell_output_dir/
├── cell_masks/
│  ├── fov0_cell_mask.tiff
│  └── fov1_cell_mask.tiff
├── cell_meta_cluster_channel_avg.csv
├── cell_meta_cluster_count_avg.csv
├── cell_meta_cluster_mapping.csv
├── cell_som_cluster_channel_avg.csv
├── cell_som_cluster_count_avg.csv
├── cell_som_weights.feather
├── cluster_counts.feather
├── cluster_counts_size_norm.feather
└── weighted_cell_channel.csv

Dataset Configurations

  • 1 - Segment Image Data:
    • Image Data
  • 2 - Pixie Cluster Pixels:
    • Image Data
    • Cell Table
    • Deepcell Output
  • 3 - Pixie Cluster Cells:
    • Image Data
    • Cell Table
    • Deepcell Output
    • Example Pixel Output
  • 4 - Post Clustering:
    • Image Data
    • Cell Table
    • Deepcell Output
    • Example Cell Output

Questions?

If you have a general question or are having trouble with part of the repo, you can refer to our FAQ or head to the discussions tab to get help. If you've found a bug with the codebase, first make sure there's not already an open issue, and if not, you can then open an issue describing the bug.

Want to contribute?

If you would like to help make ark better, please take a look at our contributing guidelines.

How to Cite

Please directly cite the ark repo (https://github.com/angelolab/ark-analysis) if it was a part of your analysis. In addition, please cite the relevant paper(s) below where applicable to your study.

  1. Greenwald, Miller et al. Whole-cell segmentation of tissue images with human-level performance using large-scale data annotation and deep learning [2021]
  2. Liu et al. Robust phenotyping of highly multiplexed tissue imaging data using pixel-level clustering [2023]

Download files

ark-analysis 0.7.0 is published as a source distribution (ark-analysis-0.7.0.tar.gz, 8.1 MB) and as built wheels for CPython 3.9, 3.10, and 3.11, covering Windows x86-64 (plus ARM64 for 3.10 and 3.11), manylinux (glibc 2.17+/2.28+, x86-64 and aarch64), and macOS (10.9+ x86-64 and universal2, 11.0+ arm64).

