Internal S3 client implementation for s3torchconnector

Amazon S3 Connector for PyTorch

The Amazon S3 Connector for PyTorch delivers high throughput for PyTorch training jobs that access or store data in Amazon S3. Using the S3 Connector for PyTorch automatically optimizes performance when downloading training data from and writing checkpoints to Amazon S3, eliminating the need to write your own code to list S3 buckets and manage concurrent requests.

Amazon S3 Connector for PyTorch provides implementations of PyTorch's dataset primitives that you can use to load training data from Amazon S3. It supports both map-style datasets for random data access patterns and iterable-style datasets for streaming sequential data access patterns. The S3 Connector for PyTorch also includes a checkpointing interface to save and load checkpoints directly to Amazon S3, without first saving to local storage.

Getting Started

Prerequisites

  • Python 3.8-3.14 is supported.
    • Note: Python 3.8 support will be deprecated in a future release, see #399
  • PyTorch >= 2.0

Installation

pip install s3torchconnector

Amazon S3 Connector for PyTorch currently provides pre-built wheels via pip only for Linux and macOS. (Note: macOS x86_64 wheel support will be deprecated in a future release, see #398.) For other platforms, see DEVELOPMENT for build instructions.

Configuration

To use s3torchconnector, AWS credentials must be provided through one of the following methods:

  • EC2 Instance Role: If you are using this library on an EC2 instance, specify an IAM role and then give the EC2 instance access to that role.
  • AWS CLI: Install and configure awscli and run aws configure.
  • AWS Credential Files: Set credentials in the AWS credentials profile file on the local system, located at: ~/.aws/credentials on Unix or macOS.
  • Environment Variables: Set the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.

To use a specific AWS profile configured in ~/.aws/config and ~/.aws/credentials, you can either:

  • Set environment variable AWS_PROFILE=custom-profile, or
  • Pass the profile name to the S3ClientConfig object, e.g. S3ClientConfig(profile="custom-profile").
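Both options can be sketched as follows; `custom-profile` is a placeholder for a profile defined in your own `~/.aws/config`:

```python
import os

# Option 1: point the whole process at a named profile via the environment.
os.environ["AWS_PROFILE"] = "custom-profile"

# Option 2 (sketch): pass the profile name per client instead:
# from s3torchconnector import S3ClientConfig
# config = S3ClientConfig(profile="custom-profile")
```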

For a more detailed configuration guide, see AWS CLI docs.

Examples

The API docs describe the public components. End-to-end examples of how to use s3torchconnector can be found in the examples directory.

Sample Examples

The simplest way to use the S3 Connector for PyTorch is to construct a dataset, either a map-style or iterable-style dataset, by specifying an S3 URI (a bucket and optional prefix) and the region the bucket is located in:

from s3torchconnector import S3MapDataset, S3IterableDataset

# You need to update <BUCKET> and <PREFIX>
DATASET_URI="s3://<BUCKET>/<PREFIX>"
REGION = "us-east-1"

iterable_dataset = S3IterableDataset.from_prefix(DATASET_URI, region=REGION)

# Datasets are also iterators. 
for item in iterable_dataset:
  print(item.key)

# S3MapDataset eagerly lists all objects under the given prefix to support
# random access. The listing happens on the first element access or the first
# call for the number of elements, whichever comes first; for large prefixes
# this can take a while and make the dataset appear unresponsive.
map_dataset = S3MapDataset.from_prefix(DATASET_URI, region=REGION)

# Randomly access an item in map_dataset.
item = map_dataset[0]

# Learn about bucket, key, and content of the object
bucket = item.bucket
key = item.key
content = item.read()
len(content)

In addition to data loading primitives, the S3 Connector for PyTorch also provides an interface for saving and loading model checkpoints directly to and from an S3 bucket.

from s3torchconnector import S3Checkpoint

import torchvision
import torch

CHECKPOINT_URI="s3://<BUCKET>/<KEY>/"
REGION = "us-east-1"
checkpoint = S3Checkpoint(region=REGION)

model = torchvision.models.resnet18()

# Save checkpoint to S3
with checkpoint.writer(CHECKPOINT_URI + "epoch0.ckpt") as writer:
    torch.save(model.state_dict(), writer)

# Load checkpoint from S3
with checkpoint.reader(CHECKPOINT_URI + "epoch0.ckpt") as reader:
    state_dict = torch.load(reader)

model.load_state_dict(state_dict)

Using datasets or checkpoints with Amazon S3 Express One Zone directory buckets only requires updating the URI to follow the base-name--azid--x-s3 bucket naming format. For example, for a directory bucket named my-test-bucket--usw2-az1--x-s3 in Availability Zone usw2-az1, the URI would be s3://my-test-bucket--usw2-az1--x-s3/<PREFIX> (note that the prefix for Amazon S3 Express One Zone should end with '/'), paired with region us-west-2.
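As a quick sanity check, the directory-bucket URI pieces can be assembled like this (the bucket and Availability Zone names are the examples from above; the prefix is a hypothetical one):

```python
base_name = "my-test-bucket"
az_id = "usw2-az1"
bucket = f"{base_name}--{az_id}--x-s3"  # base-name--azid--x-s3 format
prefix = "training-data/"               # hypothetical; must end with '/'
uri = f"s3://{bucket}/{prefix}"

assert bucket == "my-test-bucket--usw2-az1--x-s3"
assert uri.endswith("/")
```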

Distributed checkpoints

Overview

Amazon S3 Connector for PyTorch provides robust support for PyTorch distributed checkpoints. This feature includes:

  • S3StorageWriter: Implementation of PyTorch's StorageWriter interface.

  • S3StorageReader: Implementation of PyTorch's StorageReader interface.

    • Supports configurable reading strategies via the reader_constructor parameter (see Reader Configurations).
    • Uses DCPOptimizedS3Reader by default for faster loading and partial checkpoint optimizations.
    • Please refer to DCPOptimizedS3Reader Errors for troubleshooting.
  • S3FileSystem: An implementation of PyTorch's FileSystemBase.

These tools enable seamless integration of Amazon S3 with PyTorch Distributed Checkpoints, allowing efficient storage and retrieval of distributed model checkpoints.

Prerequisites and Installation

PyTorch 2.3 or newer is required. To use the distributed checkpoints feature, install S3 Connector for PyTorch with the dcp extra:

pip install s3torchconnector[dcp]

Sample Example

End-to-end examples for using distributed checkpoints with S3 Connector for PyTorch can be found in the examples/dcp directory.

from s3torchconnector.dcp import S3StorageWriter, S3StorageReader

import torchvision
import torch.distributed.checkpoint as DCP

# Configuration
CHECKPOINT_URI = "s3://<BUCKET>/<KEY>/"
REGION = "us-east-1"

model = torchvision.models.resnet18()

# Save distributed checkpoint to S3
s3_storage_writer = S3StorageWriter(
    region=REGION, 
    path=CHECKPOINT_URI,
    thread_count=8, # optional; number of IO threads to use to write
) 
DCP.save(
    state_dict=model.state_dict(),
    storage_writer=s3_storage_writer,
)

# Load distributed checkpoint from S3
# S3StorageReader uses DCPOptimizedS3Reader by default for improved performance
model = torchvision.models.resnet18()
model_state_dict = model.state_dict()
s3_storage_reader = S3StorageReader(
    region=REGION, 
    path=CHECKPOINT_URI,
)
DCP.load(
    state_dict=model_state_dict,
    storage_reader=s3_storage_reader,
)
model.load_state_dict(model_state_dict)

S3 Prefix Strategies for Distributed Checkpointing

S3StorageWriter implements various prefix strategies to optimize checkpoint organization in S3 buckets. These strategies are specifically designed to prevent throttling (503 Slow Down errors) in high-throughput scenarios by implementing S3 key naming best practices as outlined in Best practices design patterns: optimizing Amazon S3 performance.

When many distributed training processes write checkpoints simultaneously, the prefixing strategies help distribute the load across multiple S3 partitions.
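The core idea behind these strategies fits in a few lines: map each writer rank to one of several prefixes so concurrent writes fan out across key namespaces. This is an illustrative sketch only; the real strategies live in s3torchconnector.dcp:

```python
def round_robin_prefix(prefixes, rank):
    # Rank r writes under prefixes[r % len(prefixes)],
    # spreading concurrent writers across S3 key prefixes.
    return prefixes[rank % len(prefixes)]

assigned = [round_robin_prefix(["shard1", "shard2", "shard3"], r) for r in range(6)]
```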

Available Strategies

1. RoundRobinPrefixStrategy

Distributes checkpoints across specified prefixes in a round-robin fashion, ideal for balancing data across multiple storage locations.

from s3torchconnector.dcp import RoundRobinPrefixStrategy, S3StorageWriter

import torchvision
import torch.distributed.checkpoint as DCP

CHECKPOINT_URI = "s3://<BUCKET>/<KEY>/"
REGION = "us-east-1"

model = torchvision.models.resnet18()

# Initialize with multiple prefixes and optional epoch tracking
strategy = RoundRobinPrefixStrategy(
    user_prefixes=["shard1", "shard2", "shard3"],
    epoch_num=5  # Optional: for checkpoint versioning
)

writer = S3StorageWriter(
    region=REGION,
    path=CHECKPOINT_URI,
    prefix_strategy=strategy
)

# Save checkpoint
DCP.save(
    state_dict=model.state_dict(),
    storage_writer=writer
)

Output Structure:

CHECKPOINT_URI
├── shard1/
│   └── epoch_5/
│       ├── __0_0.distcp
│       ├── __3_0.distcp
│       └── ...
├── shard2/
│   └── epoch_5/
│       ├── __1_0.distcp
│       ├── __4_0.distcp
│       └── ...
└── shard3/
    └── epoch_5/
        ├── __2_0.distcp
        ├── __5_0.distcp
        └── ...

2. BinaryPrefixStrategy

Generates binary (base-2) prefixes for optimal partitioning in distributed environments.

from s3torchconnector.dcp import BinaryPrefixStrategy

strategy = BinaryPrefixStrategy(
    epoch_num=1,          # Optional: for checkpoint versioning
    min_prefix_len=10     # Optional: minimum prefix length
)

Output Structure:

s3://my-bucket/checkpoints/
├── 0000000000/
│   └── epoch_1/
│       └── __0_0.distcp
├── 1000000000/
│   └── epoch_1/
│       └── __1_0.distcp
├── 0100000000/
│   └── epoch_1/
│       └── __2_0.distcp
└── ...
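The sample layout above suggests the prefix is the rank's binary digits written least-significant-bit first and zero-padded, which keeps the leading character evenly distributed across ranks. A sketch under that assumption (not the connector's actual implementation):

```python
def binary_prefix(rank, min_prefix_len=10):
    # LSB-first binary digits of the rank, zero-padded to min_prefix_len
    bits = bin(rank)[2:][::-1]
    return bits.ljust(min_prefix_len, "0")
```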

3. HexPrefixStrategy

Uses hexadecimal (base-16) prefixes for a balance of efficiency and readability.

from s3torchconnector.dcp import HexPrefixStrategy

strategy = HexPrefixStrategy(
    epoch_num=1,          # Optional: for checkpoint versioning
    min_prefix_len=4      # Optional: minimum prefix length
)

Output Structure:

s3://my-bucket/checkpoints/
├── 0000/
│   └── epoch_1/
│       └── __0_0.distcp
├── 1000/
│   └── epoch_1/
│       └── __1_0.distcp
...
├── f000/
│   └── epoch_1/
│       └── __15_0.distcp
└── ...
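The same least-significant-digit-first idea in base 16 matches the layout above (rank 15 lands under f000/); again an illustrative sketch only:

```python
def hex_prefix(rank, min_prefix_len=4):
    # Least-significant hex digit first, zero-padded to min_prefix_len
    digits = format(rank, "x")[::-1]
    return digits.ljust(min_prefix_len, "0")
```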

Creating Custom Strategies

You can implement custom prefix strategies by extending the S3PrefixStrategyBase class:

from s3torchconnector.dcp import S3PrefixStrategyBase

class CustomPrefixStrategy(S3PrefixStrategyBase):
    def __init__(self, custom_param):
        super().__init__()
        self.custom_param = custom_param

    def generate_prefix(self, rank: int) -> str:
        return f"custom_{self.custom_param}/{rank}/"

Parallel/Distributed Training

Amazon S3 Connector for PyTorch provides support for parallel and distributed training with PyTorch, allowing you to leverage multiple processes and nodes for efficient data loading and training. Both S3IterableDataset and S3MapDataset can be used for this purpose.

S3IterableDataset

The S3IterableDataset can be directly passed to PyTorch's DataLoader for parallel and distributed training. By default, all worker processes will share the same list of training objects. However, if you need each worker to have access to a unique portion of the dataset for better parallelization, you can enable dataset sharding using the enable_sharding parameter.

from torch.utils.data import DataLoader

dataset = S3IterableDataset.from_prefix(DATASET_URI, region=REGION, enable_sharding=True)
dataloader = DataLoader(dataset, num_workers=4)

When enable_sharding is set to True, the dataset is automatically sharded across the available workers. This sharding mechanism supports both parallel training on a single host and distributed training across multiple hosts. Each worker, regardless of its host, loads and processes a distinct subset of the dataset.
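Conceptually, sharding hands each global worker every n-th object from the listing, similar to this sketch (the connector derives the worker count and index from the DataLoader and the distributed environment):

```python
keys = [f"obj_{i:02d}" for i in range(10)]
num_workers = 4  # total workers across all hosts

# Worker w processes keys[w], keys[w + num_workers], ...
shards = [keys[w::num_workers] for w in range(num_workers)]
```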

S3MapDataset

For the S3MapDataset, you need to pass it to DataLoader along with a DistributedSampler wrapped around it. The DistributedSampler ensures that each worker or node receives a unique subset of the dataset, enabling efficient parallel and distributed training.

from torch.utils.data import DataLoader, DistributedSampler

dataset = S3MapDataset.from_prefix(DATASET_URI, region=REGION)
sampler = DistributedSampler(dataset)
dataloader = DataLoader(dataset, sampler=sampler, num_workers=4)

Lightning Integration

Amazon S3 Connector for PyTorch includes an integration for PyTorch Lightning, featuring S3LightningCheckpoint, an implementation of Lightning's CheckpointIO. This allows users to make use of Amazon S3 Connector for PyTorch's S3 checkpointing functionality with PyTorch Lightning.

Getting Started

Installation

pip install s3torchconnector[lightning]

Examples

End to end examples for the PyTorch Lightning integration can be found in the examples/lightning directory.

from lightning import Trainer
from s3torchconnector.lightning import S3LightningCheckpoint

...

s3_checkpoint_io = S3LightningCheckpoint("us-east-1")
trainer = Trainer(
    plugins=[s3_checkpoint_io],
    default_root_dir="s3://bucket_name/key_prefix/"
)
trainer.fit(model)

Using S3 Versioning to Manage Checkpoints

When working with model checkpoints, you can use the S3 Versioning feature to preserve, retrieve, and restore every version of your checkpoint objects. With versioning, you can recover more easily from unintended overwrites or deletions of existing checkpoint files due to incorrect configuration or multiple hosts accessing the same storage path.

When versioning is enabled on an S3 bucket, deletions insert a delete marker instead of removing the object permanently. The delete marker becomes the current object version. If you overwrite an object, it results in a new object version in the bucket. You can always restore the previous version. See Deleting object versions from a versioning-enabled bucket for more details on managing object versions.

To enable versioning on an S3 bucket, see Enabling versioning on buckets. Normal Amazon S3 rates apply for every version of an object stored and transferred. To customize your data retention approach and control storage costs for earlier versions of objects, use object versioning with S3 Lifecycle.

S3 Versioning and S3 Lifecycle are not supported by S3 Express One Zone.

Direct S3Client Usage

For advanced use cases, you can use the S3Client directly for custom streaming patterns and integration with existing pipelines.

from s3torchconnector._s3client import S3Client

REGION = "us-east-1"
BUCKET_NAME = "my-bucket"
OBJECT_KEY = "large_object.bin"

s3_client = S3Client(region=REGION)

# Writing data to S3
data = b"content" * 1048576
s3writer = s3_client.put_object(bucket=BUCKET_NAME, key=OBJECT_KEY)
s3writer.write(data)
s3writer.close()

# Reading data from S3
s3reader = s3_client.get_object(bucket=BUCKET_NAME, key=OBJECT_KEY)
data = s3reader.read()

Reader Configurations

Amazon S3 Connector for PyTorch supports three types of readers, configurable through S3ReaderConstructor.

Reader Types

1. Sequential Reader

  • Default for non-DCP use cases.
  • Downloads and buffers the entire S3 object in memory.
  • Prioritizes performance over memory usage by buffering entire objects.

2. Range-based Reader

  • Performs byte-range requests to read specific portions of S3 objects without downloading the entire object.
  • Prioritizes memory efficiency, with performance gains only for sparse partial reads in large objects.
  • Features adaptive buffering with forward overlap handling:
    • Small reads (< buffer_size): Use internal buffer to reduce S3 API calls.
    • Large reads (≥ buffer_size): Bypass buffer for direct transfer.
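The buffer-or-bypass decision above reduces to a size comparison; a minimal sketch, assuming a hypothetical 8 MiB buffer for illustration:

```python
BUFFER_SIZE = 8 * 1024 * 1024  # hypothetical buffer size for illustration

def goes_through_buffer(read_size, buffer_size=BUFFER_SIZE):
    # Small reads are served from the internal buffer to cut S3 API calls;
    # reads at or above buffer_size bypass it for direct transfer.
    return read_size < buffer_size
```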

3. DCP-Optimized Reader

  • Default for PyTorch Distributed Checkpoint (DCP) loading with S3StorageReader.
  • Provides performance improvements through per-item buffers and zero-copy buffer management.
  • Enables efficient partial checkpoint loading (e.g. model-only) through selective data fetching with range coalescing.
  • Automatically handles range metadata injection from DCP load plan.
  • Requires sequential access patterns (automatically enforced in S3StorageReader.prepare_local_plan()).
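Range coalescing, mentioned above, merges nearby byte ranges from the load plan so fewer GET requests are issued; a simplified illustration (not the connector's actual implementation):

```python
def coalesce_ranges(ranges, max_gap=0):
    # Merge (start, end) byte ranges that overlap or sit within max_gap
    merged = []
    for start, end in sorted(ranges):
        if merged and start <= merged[-1][1] + max_gap:
            last_start, last_end = merged[-1]
            merged[-1] = (last_start, max(last_end, end))
        else:
            merged.append((start, end))
    return merged
```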

When to Use Each Reader

  • Sequential Reader: For processing entire objects, and when repeated access to the data is required. Best for most general use cases.
  • Range-based Reader: For larger objects (100MB+) that require sparse partial reads, and in memory-constrained environments.
  • DCP-Optimized Reader: For typical PyTorch Distributed Checkpoint loading scenarios for highest performance and memory-efficiency. (Default for S3StorageReader)

Note: S3Reader instances are not thread-safe and should not be shared across threads. For multiprocessing with DataLoader, each worker process creates its own S3Reader instance automatically.

Examples

For S3ReaderConstructor usage details, please refer to the S3ReaderConstructor documentation. The examples below illustrate common usage.

Direct method - S3Client usage with range-based reader without buffer:

# Direct S3Client usage for zero-copy partial reads into pre-allocated buffers,
# for memory efficiency and fast data transfer
from s3torchconnector._s3client import S3Client
from s3torchconnector import S3ReaderConstructor

REGION = "us-east-1"
BUCKET_NAME = "my-bucket"
OBJECT_KEY = "large_object.bin"

s3_client = S3Client(region=REGION)
reader_constructor = S3ReaderConstructor.range_based(
    buffer_size=0  # No buffer, for direct transfer
)
s3reader = s3_client.get_object(
    bucket=BUCKET_NAME,
    key=OBJECT_KEY,
    reader_constructor=reader_constructor
)

buffer = bytearray(10 * 1024 * 1024)  # 10 MB buffer
s3reader.seek(100 * 1024 * 1024)   # Skip to 100 MB offset
bytes_read = s3reader.readinto(buffer)  # Direct read into buffer

DCP interface - S3StorageReader usage with dcp-optimized reader:

# Load checkpoint with dcp-optimized reader for better performance
from s3torchconnector.dcp import S3StorageReader
from s3torchconnector import S3ReaderConstructor

# dcp_optimized is already the default for S3StorageReader; demonstration purposes only. 
reader_constructor = S3ReaderConstructor.dcp_optimized()
s3_storage_reader = S3StorageReader(
    region=REGION, 
    path=CHECKPOINT_URI,
    reader_constructor=reader_constructor
)
DCP.load(
    state_dict=model_state_dict,
    storage_reader=s3_storage_reader,
)

Dataset interface - S3MapDataset usage with sequential reader:

# Use sequential reader for optimal performance when reading entire objects
from s3torchconnector import S3MapDataset, S3ReaderConstructor

dataset = S3MapDataset.from_prefix(
    DATASET_URI, 
    region=REGION,
    reader_constructor=S3ReaderConstructor.sequential()
)

for item in dataset:
    content = item.read()
    ...

Contributing

We welcome contributions to Amazon S3 Connector for PyTorch. Please see CONTRIBUTING for more information on how to report bugs or submit pull requests.

Development

See DEVELOPMENT for information about code style, development process, and guidelines.

Compatibility with other storage services

S3 Connector for PyTorch delivers high throughput for PyTorch training jobs that access or store data in Amazon S3. While it may be functional against other storage services that use S3-like APIs, they may inadvertently break when we make changes to better support Amazon S3. We welcome contributions of minor compatibility fixes or performance improvements for these services if the changes can be tested against Amazon S3.

Security issue notifications

If you discover a potential security issue in this project we ask that you notify AWS Security via our vulnerability reporting page.

Code of conduct

This project has adopted the Amazon Open Source Code of Conduct. See CODE_OF_CONDUCT.md for more details.

License

Amazon S3 Connector for PyTorch has a BSD 3-Clause License, as found in the LICENSE file.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

s3torchconnectorclient-1.5.0.tar.gz (85.5 kB view details)

Uploaded Source

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

s3torchconnectorclient-1.5.0-cp314-cp314-manylinux_2_28_x86_64.whl (3.7 MB view details)

Uploaded CPython 3.14manylinux: glibc 2.28+ x86-64

s3torchconnectorclient-1.5.0-cp314-cp314-manylinux_2_28_aarch64.whl (3.6 MB view details)

Uploaded CPython 3.14manylinux: glibc 2.28+ ARM64

s3torchconnectorclient-1.5.0-cp314-cp314-macosx_11_0_arm64.whl (2.0 MB view details)

Uploaded CPython 3.14macOS 11.0+ ARM64

s3torchconnectorclient-1.5.0-cp313-cp313-manylinux_2_28_x86_64.whl (3.7 MB view details)

Uploaded CPython 3.13manylinux: glibc 2.28+ x86-64

s3torchconnectorclient-1.5.0-cp313-cp313-manylinux_2_28_aarch64.whl (3.6 MB view details)

Uploaded CPython 3.13manylinux: glibc 2.28+ ARM64

s3torchconnectorclient-1.5.0-cp313-cp313-macosx_11_0_arm64.whl (2.0 MB view details)

Uploaded CPython 3.13macOS 11.0+ ARM64

s3torchconnectorclient-1.5.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.7 MB view details)

Uploaded CPython 3.12manylinux: glibc 2.17+ x86-64

s3torchconnectorclient-1.5.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB view details)

Uploaded CPython 3.12manylinux: glibc 2.17+ ARM64

s3torchconnectorclient-1.5.0-cp312-cp312-macosx_11_0_arm64.whl (2.0 MB view details)

Uploaded CPython 3.12macOS 11.0+ ARM64

s3torchconnectorclient-1.5.0-cp312-cp312-macosx_10_13_x86_64.whl (2.1 MB view details)

Uploaded CPython 3.12macOS 10.13+ x86-64

s3torchconnectorclient-1.5.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.8 MB view details)

Uploaded CPython 3.11manylinux: glibc 2.17+ x86-64

s3torchconnectorclient-1.5.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB view details)

Uploaded CPython 3.11manylinux: glibc 2.17+ ARM64

s3torchconnectorclient-1.5.0-cp311-cp311-macosx_11_0_arm64.whl (2.0 MB view details)

Uploaded CPython 3.11macOS 11.0+ ARM64

s3torchconnectorclient-1.5.0-cp311-cp311-macosx_10_12_x86_64.whl (2.1 MB view details)

Uploaded CPython 3.11macOS 10.12+ x86-64

s3torchconnectorclient-1.5.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.8 MB view details)

Uploaded CPython 3.10manylinux: glibc 2.17+ x86-64

s3torchconnectorclient-1.5.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB view details)

Uploaded CPython 3.10manylinux: glibc 2.17+ ARM64

s3torchconnectorclient-1.5.0-cp310-cp310-macosx_11_0_arm64.whl (2.0 MB view details)

Uploaded CPython 3.10macOS 11.0+ ARM64

s3torchconnectorclient-1.5.0-cp310-cp310-macosx_10_12_x86_64.whl (2.1 MB view details)

Uploaded CPython 3.10macOS 10.12+ x86-64

s3torchconnectorclient-1.5.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.8 MB view details)

Uploaded CPython 3.9manylinux: glibc 2.17+ x86-64

s3torchconnectorclient-1.5.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB view details)

Uploaded CPython 3.9manylinux: glibc 2.17+ ARM64

s3torchconnectorclient-1.5.0-cp39-cp39-macosx_11_0_arm64.whl (2.0 MB view details)

Uploaded CPython 3.9macOS 11.0+ ARM64

s3torchconnectorclient-1.5.0-cp39-cp39-macosx_10_12_x86_64.whl (2.1 MB view details)

Uploaded CPython 3.9macOS 10.12+ x86-64

s3torchconnectorclient-1.5.0-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (3.8 MB view details)

Uploaded CPython 3.8manylinux: glibc 2.17+ x86-64

s3torchconnectorclient-1.5.0-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (3.6 MB view details)

Uploaded CPython 3.8manylinux: glibc 2.17+ ARM64

s3torchconnectorclient-1.5.0-cp38-cp38-macosx_11_0_arm64.whl (2.0 MB view details)

Uploaded CPython 3.8macOS 11.0+ ARM64

s3torchconnectorclient-1.5.0-cp38-cp38-macosx_10_12_x86_64.whl (2.1 MB view details)

Uploaded CPython 3.8macOS 10.12+ x86-64

File details

Details for the file s3torchconnectorclient-1.5.0.tar.gz.

File metadata

  • Download URL: s3torchconnectorclient-1.5.0.tar.gz
  • Upload date:
  • Size: 85.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Hashes for s3torchconnectorclient-1.5.0.tar.gz
Algorithm Hash digest
SHA256 09ffceca1fd025abd8a4a4cbd94b3f70a7c8ccfbf3e0f76337e180f95ce58e61
MD5 416a3576736ca85f8b19efc1709855bd
BLAKE2b-256 a58de04febe3e7ff7c91bc4678a16bec1c87674fc9c160c75a8f8745e516e563

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp314-cp314-manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp314-cp314-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 45bf53277fc2152a362db3177fca4d28b7142300ca0b83a063a929010595502b
MD5 d19243db307f5c6f2b43998dee2b73ed
BLAKE2b-256 ae3f532119548692f91a20ac0e720a116eb73a4330cb3e34d4862b0984b2ee15

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp314-cp314-manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp314-cp314-manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 805eab12f0f18eaeb0ea6b2dde24c5d9546a51e41a825218152082681f996c80
MD5 409085a44fe98d361699a4c778217633
BLAKE2b-256 8d631eb1919f3f167c3ee4ccc84264697b67888617233d26b798a149588d496a

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp314-cp314-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp314-cp314-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 6604f7b44c1c5d76682a65727255adf18ca2d49e4d97f46dccff101d9ed95ee5
MD5 5574d0224ab3284b17efdc3a24c0a3d4
BLAKE2b-256 4bfb32d9e7f3361c07722a6ed94d377c2523cd7166b0b8258f22f2f92a84eab8

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp313-cp313-manylinux_2_28_x86_64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp313-cp313-manylinux_2_28_x86_64.whl
Algorithm Hash digest
SHA256 e7e53ab3d066bf1a2508b5d5ad880245cb6af4e7d2f4e8edec63cec09b86ba2b
MD5 0113d4d5c8f04b6c973ae1f6b11881ad
BLAKE2b-256 1f2b1c99152a29da2d5936d20d3ff52bbcae064e612048b12c3d7f9b95df57a3

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp313-cp313-manylinux_2_28_aarch64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp313-cp313-manylinux_2_28_aarch64.whl
Algorithm Hash digest
SHA256 9c99e52176e2874e172afd4123ed3df818ad7752084a6eec45982356a58f2b90
MD5 8253f87dca188482cb0da768aa841b0a
BLAKE2b-256 aacb75e57ef933b95144569010788ff25ce3aee771d49aa2cc8946f0a6452844

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp313-cp313-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp313-cp313-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 f615a084432761242466c59c55bf7b0cf53555d903d33e5a8a638c7cc40569b3
MD5 b7ef050c400dfd6062e5d6762a30b97f
BLAKE2b-256 5a86a0cb960df36ebc42292bcdf9e0cd3b60e076d50b1abf9ab3cc5654856225

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 0f5277d76b4d1e12cd6f96823cf5911c51a7a614acbabb4ee4133d8caa332df1
MD5 932a189824df75c670a0e7adf18e75bc
BLAKE2b-256 35d39354e5620c3839393ff9afe2435f5e42bb63eb829edd93395cb0a3b1aa39

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 7c0d11b4da0271414ffa370718bbbfb5454dac2ad546d89c7c6c49831e2eb7e5
MD5 0776432d1d8c4febe21c0ccb8af35f2f
BLAKE2b-256 7d51288b8857991cffa36b833c7128897766fb84f3a4a60a5cc3dfe6e2546f8a

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp312-cp312-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp312-cp312-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 1eba5cfc67d7e2bd3cd51400105288a979096cfb293c604d19cdd880f960c396
MD5 03f85c3491097169ccc533e67f929f16
BLAKE2b-256 e620629141bf19c24fedda41f9c710e55439d6303784cc1ca8e367367a51e08b

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp312-cp312-macosx_10_13_x86_64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp312-cp312-macosx_10_13_x86_64.whl
Algorithm Hash digest
SHA256 83ae3c096da011af6e57947d2530814a4f78935bf1336117547984da34e1cdec
MD5 c198e1c276af6efeb95178c3c4fddadb
BLAKE2b-256 caca65c66f2b4cc331f3d8fb92961f90edf8e9964fa6890ef7f335fbf9d7989f

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 ff53a57f092996e84d1149f58715bac1e55fc5ec9b5b8408efff14980c9aab57
MD5 a0bf257ece17c7782daadabef00f57c8
BLAKE2b-256 d8c01a0232e6350a84b32387d7779e0f5907a13768c12973463da948cf90c9c1

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 eca11c10e61980f41ec292309b81ffbf1e0932a0f35ca8868df182a288cc85ae
MD5 4149828794399d07c42a74b99723e34d
BLAKE2b-256 0bcf9b95a31f3dd52e61db29f0d073b3dc0605a13e2f582ea97dfb97ad8d2424

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp311-cp311-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp311-cp311-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 fd2a67bdc9abcb7f46c741c7f89a0858910e9c7c1068e1eafdb30f21c4ee44c1
MD5 34ae4f0a0a06692faa05b5cdd8a0b21a
BLAKE2b-256 1435d2298baf9d6e8e21baa459e85c99762ee30cb16d90f0a68bd2eeb997be7c

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp311-cp311-macosx_10_12_x86_64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp311-cp311-macosx_10_12_x86_64.whl
Algorithm Hash digest
SHA256 d57d6e797128bae169cc099c502d773c5a8033662d037991b94d687dfa1b5f5d
MD5 49dc222d3a61d51f07ef7e6d00a2d18b
BLAKE2b-256 8cbf5ab5e80413c5a10f5a1c8839131722895852c11f8a4cf8abc224fb4fadb0

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 0c6f193d76584b56132d4618fbbd0ce34003d93e7c26029d7b90fde7d65ab987
MD5 2e4584487c5b0ae5dc536e8d97348884
BLAKE2b-256 b708b87972f794ee7dbfcd364e8a5584e2a1e3ba833b77e12694850921aa873d

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 a678161fa92e5665a2cecfb0058df45bb77d18f9ee613b689f7b35e1978762e7
MD5 78d3806eedad1c14d115b59be15bad2c
BLAKE2b-256 eb120b7cb2be529d64f7e02b54d32d30a8b41b7c8593f264e17c2834e678538c

See more details on using hashes here.

File details

Details for the file s3torchconnectorclient-1.5.0-cp310-cp310-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for s3torchconnectorclient-1.5.0-cp310-cp310-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 beb8d81cd4211c117ec0eaeb5e4538481f2eeb7f553d726a19733d544ee1bf39
MD5 4b55c24832e470b5dfa93fea5f0569e9
BLAKE2b-256 74339935071caa1e1b6a1578931b93e02d816e530bbd6cbb1a746eb172dc80df

See more details on using hashes here.

Hashes for s3torchconnectorclient-1.5.0-cp310-cp310-macosx_10_12_x86_64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 737a3935023ac1c2684b694766020a8bcda3659366649465ff7e5f85dac639e4 |
| MD5 | 35fec21552b21aeb4ebccaa396c57a32 |
| BLAKE2b-256 | 219e3a7d9f6e25a312d11bc5ad7a669a1ee76bd0c95ab657dd85246c591805b2 |

Hashes for s3torchconnectorclient-1.5.0-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 8a9cda85eca520fc425fc03e013c9250e51e19f4adca0897ef4f16587fd856fa |
| MD5 | 64d31c830b112915eca0294dcfe38efd |
| BLAKE2b-256 | b5430da2240544d3515d0a03597f21bcf211732e92921472e8d4a77fc20f5218 |

Hashes for s3torchconnectorclient-1.5.0-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 6c99eb5e96302b6eeff1cbb6ebdbf709747b35400a7998bf8923da8804d21702 |
| MD5 | 8113d408672dd5668d5885150b97a727 |
| BLAKE2b-256 | 4e73d1e2dcabe30aef80a00846dbe383ef81180aaa3b8301a82bf64385fe4cd5 |

Hashes for s3torchconnectorclient-1.5.0-cp39-cp39-macosx_11_0_arm64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 21af8a44117ffeff5a3f3cd51b3f213b025c2791449da2a98da0581ba03c1b3f |
| MD5 | 4e8d213088c2f1855abbb53deff643ab |
| BLAKE2b-256 | ff210f5e738e3872c15a88ddfc85d09b2aded8198a506b7380a556bf5aa2c66d |

Hashes for s3torchconnectorclient-1.5.0-cp39-cp39-macosx_10_12_x86_64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | aa3d41ebd39ba922b1a0b6effe0f99bcc537f312e0eaedd864978766a1b2e08b |
| MD5 | 847cf22b7d7509fa2d67745cbd7139ed |
| BLAKE2b-256 | a63936a3408b29e2cc128dc59b34a472ff5c64c6bd716ce35131dda72aebd31c |

Hashes for s3torchconnectorclient-1.5.0-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 3779dc1f6e30f8a7152c04034ee945745e0130825d8aef12e5630b53fea2575a |
| MD5 | cd4889a415b01c8cfccf220c5e86d37a |
| BLAKE2b-256 | 1222356aa9e0ff1c4580a2bf428bb88b63f08c516f008dadc206ddd82feffc78 |

Hashes for s3torchconnectorclient-1.5.0-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 96ea6ea9153452855c297f2dbaa347b31a84d46d8357fa7ae071ffd48e074125 |
| MD5 | e3ec95d2db3492db4f732778c6744ada |
| BLAKE2b-256 | afd16ee95433dc35b136f47ea3283ce8cbc4eb0dca936ffe02436665aca5dd54 |

Hashes for s3torchconnectorclient-1.5.0-cp38-cp38-macosx_11_0_arm64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | d878ac8e0f666c6aafc408c3147586817baf82abdf341cabbf826d12462e150e |
| MD5 | 3f936e342dd7943df24d2e9120cd4d97 |
| BLAKE2b-256 | 6e28fefe2895d19c5e8766678264014e0cc29020ba738b115c67b6cfb396d03d |

Hashes for s3torchconnectorclient-1.5.0-cp38-cp38-macosx_10_12_x86_64.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 53b1dd3d1d1a32393b73b2d79f922f86ec2c7737c651d1ad9c32257a4c7a8e8b |
| MD5 | 8667e97b70fe1e8cfb47f16ec771348b |
| BLAKE2b-256 | 04890e160834a7765b0cbeacb0db7e4b360f5c7ac9758945d784ec3312210817 |
