AIS Python SDK
AIS Python SDK provides a (growing) set of client-side APIs to access and utilize AIS clusters, buckets, and objects.
The project is, essentially, a Python port of the AIS Go APIs, with additional objectives that prioritize utmost convenience for Python developers.
Note that only Python 3.x (version 3.6 or later) is currently supported.
Installation
Install as a Package
The latest AIS release can be easily installed either with Anaconda or pip:
$ conda install aistore
$ pip install aistore
Install From Source
If you'd like to work with the current upstream (and don't mind the risk), install the latest master directly from GitHub:
$ git clone https://github.com/NVIDIA/aistore.git
$ cd aistore/python/
$ pip install -e .
Quick Start
In order to interact with your running AIS instance, you will need to create a client object:
from aistore.sdk import Client
client = Client("http://localhost:8080")
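If your cluster is not at the default local endpoint, the URL can just as well come from configuration; a minimal sketch, assuming the endpoint is stored in the AIS_ENDPOINT environment variable (the same variable the AIS CLI uses):
import os
from aistore.sdk import Client

# Fall back to a local deployment when AIS_ENDPOINT is unset
client = Client(os.getenv("AIS_ENDPOINT", "http://localhost:8080"))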
The newly created client object can be used to interact with your AIS cluster, buckets, and objects. Here are a few ways to do so:
# Check if AIS is deployed and running
client.cluster().is_aistore_running()
# Get cluster information
client.cluster().get_info()
# Create a bucket named "my-ais-bucket"
client.bucket("my-ais-bucket").create()
# Delete bucket named "my-ais-bucket"
client.bucket("my-ais-bucket").delete()
# Head bucket
client.bucket("my-ais-bucket").head()
# Head object
client.bucket("my-ais-bucket").object("my-object").head()
# Put Object
client.bucket("my-ais-bucket").object("my-new-object").put("path-to-object")
If you are using AIS buckets, you can simply omit the provider argument (it defaults to ProviderAIS) when instantiating a bucket object: client.bucket("my-ais-bucket").create() is equivalent to client.bucket("my-ais-bucket", provider="ais").create().
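To read an object back after a put, chain get() with read_all(), the same reader method used in the ETL examples below:
# Get (and fully read) the object that was put above
content = client.bucket("my-ais-bucket").object("my-new-object").get().read_all()
print(len(content))  # object size, in bytes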
Working with multiple objects
AIS supports multi-object operations on groups of objects. An ObjectGroup can be created with one of:
- a list of object names
- an ObjectRange
- a string template.
# Create Object Group by list of names
my_objects = client.bucket("my-ais-bucket").objects(obj_names=["my-obj-1", "my-obj-2", "my-obj-3"])
# Create Object Group by ObjectRange
from aistore.sdk import ObjectRange  # import path may vary by SDK version

my_object_range = ObjectRange(prefix="my-obj", min_index="1", max_index="3")
my_objects = client.bucket("my-ais-bucket").objects(obj_range=my_object_range)
String templates can be passed directly to AIS, following the syntax described here:
# Create Object Group by Template String
my_object_template = "my-obj-{1..3}"
my_objects = client.bucket("my-ais-bucket").objects(obj_template=my_object_template)
# More advanced template example with multiple ranges and defined steps
complex_range = "my-obj-{0..10..2}-details-{1..9..2}-.file-extension"
my_objects = client.bucket("my-ais-bucket").objects(obj_template=complex_range)
# Delete Multiple Objects
my_objects.delete()
# Evict Multiple Objects
my_objects.evict()
# Prefetch Multiple Objects
my_objects.prefetch()
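Multi-object operations run asynchronously on the cluster and return the ID of the job performing them. A minimal sketch, assuming the delete call returns a job ID and that Job.wait() is available in your SDK version:
# Hypothetical: wait for a multi-object delete to finish
delete_job_id = my_objects.delete()
client.job(job_id=delete_job_id).wait()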
External Cloud Storage Buckets
AIS supports a number of different backend providers or, simply, backends.
For exact definitions and related capabilities, please see terminology.
Many bucket/object operations support remote cloud buckets (third-party backend-based cloud buckets), including a few of the operations shown above. To interact with remote cloud buckets, you need to specify the provider of choice when instantiating your bucket object as follows:
# Head AWS bucket
client.bucket("my-aws-bucket", provider="aws").head()
# Evict GCP bucket
client.bucket("my-gcp-bucket", provider="gcp").evict()
# Get object from Azure bucket
client.bucket("my-azure-bucket", provider="azure").object("filename.ext").get()
# List objects in AWS bucket
client.bucket("my-aws-bucket", provider="aws").list_objects()
Please note that certain operations do not support external cloud storage buckets. Please refer to the API reference documentation for more information on which bucket/object operations support remote cloud buckets, as well as general information on class and method usage.
ETLs
AIStore also supports ETLs, short for Extract-Transform-Load. ETLs with AIS are beneficial because the transformations run locally, close to the data, which largely contributes to the linear scalability of AIS.
Note: AIS-ETL requires Kubernetes. For more information on deploying AIStore with Kubernetes (or Minikube), refer here.
The following example is a sample workflow involving AIS-ETL.
We can initialize ETLs with either code or spec.
We initialize an ETL w/ code:
import hashlib
# Defining ETL transformation code
def transform(input_bytes):
    md5 = hashlib.md5()
    md5.update(input_bytes)
    return md5.hexdigest().encode()
# Initializing ETL with transform()
client.etl().init_code(transform=transform, etl_name="etl-code")
We initialize another ETL w/ spec:
from aistore.sdk.etl_templates import MD5
template = MD5.format(communication_type="hpush")
client.etl().init_spec(template=template, etl_name="etl-spec")
Refer to more ETL templates here.
Once initialized, we can verify the ETLs are running with method list():
# List all running ETLs
client.etl().list()
We can get an object with the ETL transformations applied:
# Get object w/ ETL code transformation
obj1 = client.bucket("bucket-demo").object("object-demo").get(etl_name="etl-code").read_all()
# Get object w/ ETL spec transformation
obj2 = client.bucket("bucket-demo").object("object-demo").get(etl_name="etl-spec").read_all()
Alternatively, we can transform an entire bucket's contents as follows:
# Transform bucket w/ ETL code transformation
client.bucket("bucket-demo").transform(etl_name="etl-code", to_bck="bucket-transformed")
# Transform bucket w/ ETL spec transformation
client.bucket("bucket-demo").transform(etl_name="etl-spec", to_bck="bucket-transformed")
Transform also allows for on-the-fly rename operations for objects:
# Add a prefix to the resulting transformed files:
client.bucket("bucket-demo").transform(etl_name="etl-code", to_bck="bucket-transformed", prefix="transformed-")
# Replace existing filename extensions
client.bucket("bucket-demo").transform(etl_name="etl-spec", to_bck="bucket-transformed", ext={"jpg":"txt"})
We can stop the ETLs if desired with method stop():
# Stop ETL
client.etl().stop(etl_name="etl-code")
client.etl().stop(etl_name="etl-spec")
# Verify ETLs are not actively running
client.etl().list()
If an ETL is stopped, any Kubernetes pods created for the ETL are stopped, but not deleted. Any transforms by the stopped ETL are terminated. Stopped ETLs can be resumed for use with method start():
# Start ETLs
client.etl().start(etl_name="etl-code")
client.etl().start(etl_name="etl-spec")
# Verify ETLs are actively running
client.etl().list()
Once completely finished with the ETLs, we clean up (for storage) by stopping the ETLs with stop and subsequently deleting them with delete:
# Stop ETLs
client.etl().stop(etl_name="etl-code")
client.etl().stop(etl_name="etl-spec")
# Delete ETLs
client.etl().delete(etl_name="etl-code")
client.etl().delete(etl_name="etl-spec")
Deleting an ETL deletes all pods created by Kubernetes for the ETL, as well as any specifications for the ETL on Kubernetes. Consequently, deleted ETLs cannot be started again and will need to be re-initialized.
For an interactive demo, refer here.
More Examples
For more in-depth examples, please see AIStore Python SDK Examples Directory.
API Documentation
Module | Summary
---|---
api.py | Contains the Client class, which has methods for making HTTP requests to an AIStore server. Includes factory constructors for the Bucket, Cluster, and Job classes.
cluster.py | Contains the Cluster class, which represents a cluster bound to a client and contains all cluster-related operations, including checking the cluster's health and retrieving vital cluster information.
bucket.py | Contains the Bucket class, which represents a bucket in an AIS cluster and contains all bucket-related operations, including (but not limited to) creating, deleting, evicting, renaming, and copying.
object.py | Contains the Object class, which represents an object belonging to a bucket in an AIS cluster and contains all object-related operations, including (but not limited to) retrieving, adding, and deleting objects.
object_group.py | Contains the ObjectGroup class, representing a collection of objects belonging to a bucket in an AIS cluster. Includes all multi-object operations such as deleting, evicting, and prefetching objects.
job.py | Contains the Job class and all job-related operations.
etl.py | Contains the Etl class and all ETL-related operations.
For more information on API usage, refer to the API reference documentation.
PyTorch Integration
You can list and load data from AIS buckets (buckets that are not 3rd-party backend-based) and remote cloud buckets (3rd-party backend-based cloud buckets) in PyTorch using AISFileLister and AISFileLoader, which are now available as part of the official pytorch/data project.
from torchdata.datapipes.iter import AISFileLister, AISFileLoader
# Provide a list of prefixes to load and list data from
ais_prefixes = ['gcp://bucket-name/folder/', 'aws://bucket-name/folder/', 'ais://bucket-name/folder/', ...]
# List all files for these prefixes using AISFileLister
dp_ais_urls = AISFileLister(url='localhost:8080', source_datapipe=ais_prefixes)
# print(list(dp_ais_urls))
# Load files using AISFileLoader
dp_files = AISFileLoader(url='localhost:8080', source_datapipe=dp_ais_urls)
for url, file in dp_files:
    pass
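From here, the datapipe can be consumed like any other torchdata pipe, for example through a standard DataLoader; a minimal sketch, assuming each item is a (url, file-stream) pair as yielded above:
from torch.utils.data import DataLoader

# batch_size=None yields items one at a time, without collation
loader = DataLoader(dp_files, batch_size=None)
for url, file in loader:
    data = file.read()  # raw bytes of one object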
An Alternative: Using Boto3 (and botocore)
As an alternative to the AIStore Python API, you may prefer to use Amazon's popular Boto3 library, or possibly botocore, which boto3 uses under the hood.
By default, botocore doesn't handle HTTP redirects, which prevents you from using it with AIStore.
To resolve this, install aistore with the botocore extra, and then import aistore.botocore_patch.botocore in your code. This will monkey-patch HTTP redirect support into botocore.
$ pip install aistore[botocore]
import boto3
from aistore.botocore_patch import botocore
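Once the patch is imported, boto3 can talk to AIStore's S3-compatible endpoint; a minimal sketch, assuming a local cluster serving the S3 API under /s3 and placeholder credentials (hypothetical values, since AIS does not validate them by default):
import boto3
from aistore.botocore_patch import botocore  # apply the redirect patch first

s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:8080/s3",  # assumption: local AIS S3 endpoint
    aws_access_key_id="placeholder",          # hypothetical credentials
    aws_secret_access_key="placeholder",
)
print(s3.list_buckets())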