
Convenient client-side connectors for PyTorch and TensorFlow to an AIStore cluster


AIS Python SDK

AIS Python SDK provides a (growing) set of client-side APIs to access and utilize AIS clusters.

The project is, essentially, a Python port of the AIS Go APIs, with additional objectives that include:

  • utmost convenience for Python developers;
  • minimal, or no changes whatsoever, to apps that already use S3.

Note that only Python 3.x (version 3.6 or later) is currently supported.

Installation

Install from a package

The latest AIS release can be easily installed either with Anaconda (recommended) or pip:

$ conda install aistore
$ pip install aistore

Install from the sources

If you'd like to work with the current upstream (and don't mind the risk), install the latest master directly from GitHub:

$ cd sdk/python # If you are not here already.
$ pip install -e .

Quick start

If you've already used the Python SDK for AWS (aka Boto3), the AIS SDK should feel very familiar.

Similar to Boto3, the steps include:

  1. Initialize the connection to storage by creating a client.
  2. Call client methods with assorted (and explicitly enumerated) named arguments.

Names of the most common operations are also identical, e.g.:

  • create_bucket - create a new empty bucket
  • put_object - upload an object to a bucket

and so on.
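As an illustration of this calling pattern, here is a minimal, self-contained sketch. It uses a stub in place of the real Client so that it runs without a live cluster; see the Example section for actual SDK calls.

```python
# Minimal sketch of the two-step pattern: create a client, then call
# named methods. StubClient is a stand-in for the real aistore Client,
# so this snippet runs without a live AIS cluster.
class StubClient:
    def __init__(self, endpoint: str):
        self.endpoint = endpoint
        self.buckets = {}

    def create_bucket(self, bck_name: str):
        # create a new empty bucket
        self.buckets[bck_name] = {}

    def put_object(self, bck_name: str, obj_name: str, content: bytes):
        # upload an object to a bucket
        self.buckets[bck_name][obj_name] = content

# Step 1: initialize the connection to storage by creating a client.
client = StubClient("http://localhost:8080")

# Step 2: call client methods with named arguments.
client.create_bucket("bck")
client.put_object(bck_name="bck", obj_name="obj1", content=b"hello")
print(list(client.buckets["bck"]))  # → ['obj1']
```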

AIS supports multiple backends

AWS works with only one kind of bucket - AWS buckets. AWS SDK functions accept only the bucket name, e.g. create_bucket(Bucket="bck").

AIS, on the other hand, supports a number of different backend providers or, simply, backends.

For exact definitions and related capabilities, please see the terminology section.

And so, for AIS, a bucket name alone does not, strictly speaking, uniquely identify a bucket.

That is why the majority of the SDK functions accept two arguments:

  • bck_name - for bucket name, and
  • optional provider - for backend provider.

The default provider is ProviderAIS (see const.py for this and other system constants).

If you only work with AIS buckets, in most cases you can simply omit the provider.
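The defaulting behavior can be sketched as follows. Note that the literal value "ais" for ProviderAIS is an assumption made here for illustration; check const.py for the actual constant.

```python
# Sketch of how the optional 'provider' argument defaults to ProviderAIS.
# The literal value "ais" is assumed for illustration; see const.py for
# the actual system constants.
ProviderAIS = "ais"

def list_buckets(provider: str = ProviderAIS):
    # Stub: echo the provider that would be queried.
    return f"listing buckets for provider={provider!r}"

# Omitting the provider is the same as passing ProviderAIS explicitly:
assert list_buckets() == list_buckets(provider=ProviderAIS)
```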

Calling Client methods

Every Client method can be called in two ways: with named arguments in arbitrary order and with positional arguments. For instance, list_objects method is declared as:

def list_objects(self,
     bck_name: str,
     provider: str = ProviderAIS,
     prefix: str = "",
     props: str = "",
     count: int = 0,
     page_size: int = 0,
) -> List[BucketEntry]:

To get the first 10 objects of the AIS bucket bck1 whose names start with img-, call it either with positional arguments:

objects = client.list_objects("bck1", ProviderAIS, "img-", "", 10)

or with named ones:

# ProviderAIS is omitted because it is the default value for the provider argument
objects = client.list_objects(bck_name="bck1", prefix="img-", count=10)
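That the two calling styles resolve to the same arguments can be checked with a stub that mirrors the signature above and simply records what it receives (a sketch; the real method queries the cluster, and the value "ais" for ProviderAIS is assumed):

```python
ProviderAIS = "ais"  # value assumed for illustration; see const.py

def list_objects(bck_name, provider=ProviderAIS, prefix="", props="",
                 count=0, page_size=0):
    # Stub: return the resolved arguments instead of listing objects.
    return dict(bck_name=bck_name, provider=provider, prefix=prefix,
                props=props, count=count, page_size=page_size)

positional = list_objects("bck1", ProviderAIS, "img-", "", 10)
named = list_objects(bck_name="bck1", prefix="img-", count=10)
assert positional == named  # both request 10 'img-' objects from bck1
```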

Example

from aistore.client.api import Client
from aistore.client.const import ProviderAIS

# Assuming that AIStore server is running on the same machine
client = Client("http://localhost:8080")

# Create a new AIS bucket.
# Note: this function does not accept 'provider' because the AIStore SDK supports creating AIS buckets only.
client.create_bucket("bck")

# List the buckets.
# By default, it returns only AIS buckets. To get all buckets, including Cloud ones,
# pass an empty string as the provider:
#   bucket_list = client.list_buckets(provider = "")
# The call below is the same as 'bucket_list = client.list_buckets(provider = ProviderAIS)'
bucket_list = client.list_buckets()

# Put an object into the new bucket. The object's content is read from the local file '/tmp/obj1_content'.
# The method returns properties of the new object, such as 'ETag'.
# The 'provider' argument is optional and could be omitted here; it is included for clarity.
obj_props = client.put_object(bck_name="bck", obj_name="obj1", path="/tmp/obj1_content", provider=ProviderAIS)

# Destroy the bucket and its content.
# Note: this function also does not accept 'provider' because the AIStore SDK supports destroying AIS buckets only.
client.destroy_bucket("bck")

