cloud-mappings
MutableMapping implementations for common cloud storage providers
Azure Blob Storage, Azure Table Storage, Google Cloud Storage, and AWS S3 are currently implemented. Contributions of new providers are welcome.
Installation
with pip:
pip install cloud-mappings
By default, cloud-mappings does not install any of the storage provider dependencies it requires. To install them alongside cloud-mappings, run any combination of:
pip install cloud-mappings[azureblob,azuretable,gcpstorage,awss3]
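For example, to pull in only the Azure Blob Storage dependencies:
pip install cloud-mappings[azureblob]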
Instantiation
AzureBlobMapping:
from cloudmappings import AzureBlobMapping
cm = AzureBlobMapping.with_pickle(
account_url="AZURE_BLOB_STORAGE_URL",
container_name="CONTAINER_NAME",
credential=AZURE_CREDENTIAL_OBJECT,
)
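The AZURE_CREDENTIAL_OBJECT can be built with the azure-identity package; an illustrative sketch (any credential accepted by the Azure SDK should work):
from azure.identity import DefaultAzureCredential  # pip install azure-identity
from cloudmappings import AzureBlobMapping

cm = AzureBlobMapping.with_pickle(
    account_url="AZURE_BLOB_STORAGE_URL",
    container_name="CONTAINER_NAME",
    # DefaultAzureCredential tries environment variables, managed identity,
    # and developer logins (e.g. the Azure CLI) in turn.
    credential=DefaultAzureCredential(),
)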
AzureTableMapping:
from cloudmappings import AzureTableMapping
cm = AzureTableMapping.with_pickle(
connection_string="AZURE_TABLE_CONNECTION_STRING",
table_name="TABLE_NAME",
)
Note that Azure Table Storage has a 1MB size limit per entity.
GoogleCloudStorageMapping:
from cloudmappings import GoogleCloudStorageMapping
cm = GoogleCloudStorageMapping.with_pickle(
project="GCP_PROJECT",
credentials=GCP_CREDENTIALS_OBJECT,
bucket_name="BUCKET_NAME",
)
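The GCP_CREDENTIALS_OBJECT can be created with the google-auth package; a sketch assuming a service-account key file (the file path is illustrative):
from google.oauth2.service_account import Credentials  # pip install google-auth
from cloudmappings import GoogleCloudStorageMapping

# Load credentials from a downloaded service-account key file.
credentials = Credentials.from_service_account_file("service-account.json")
cm = GoogleCloudStorageMapping.with_pickle(
    project="GCP_PROJECT",
    credentials=credentials,
    bucket_name="BUCKET_NAME",
)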
AWSS3Mapping:
from cloudmappings import AWSS3Mapping
cm = AWSS3Mapping.with_pickle(
bucket_name="AWS_BUCKET_NAME",
silence_warning=False,
)
Note that AWS S3 does not support server-side atomic requests, so it is not recommended for concurrent use. A warning is printed by default, but it may be silenced by passing silence_warning=True.
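For example, acknowledging the limitation and suppressing the warning:
from cloudmappings import AWSS3Mapping

cm = AWSS3Mapping.with_pickle(
    bucket_name="AWS_BUCKET_NAME",
    silence_warning=True,  # we accept the lack of atomic concurrent writes
)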
Usage
Use it just like a standard dict()!
cm["key"] = 1000
cm["key"] # returns 1000
del cm["key"]
"key" in cm # returns false
Cloud Sync
Each cloud-mapping keeps an internal dict of etags which it uses to ensure it only reads, overwrites, or deletes data it expects to. If the value in storage is not what the cloud-mapping expects, a cloudmappings.errors.KeySyncError() will be raised. If you know what you are doing and want your operation to go through anyway, you will need to sync your cloud-mapping with the cloud by calling either .sync_with_cloud() to sync all keys or .sync_with_cloud(key) to sync a specific key. By default .sync_with_cloud() is called on instantiation of a cloud-mapping if the underlying provider storage already exists. You may skip this initial sync by passing an additional sync_initially=False parameter when you instantiate your cloud-mapping.
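A minimal sketch of recovering from a conflict, assuming another writer has changed the key since this mapping last saw it:
from cloudmappings.errors import KeySyncError

try:
    cm["key"] = "new value"
except KeySyncError:
    # Another writer updated "key"; refresh our etag for it, inspect the
    # latest value if needed, then retry the write deliberately.
    cm.sync_with_cloud("key")
    cm["key"] = "new value"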
Serialisation
If you don't call .with_pickle() and instead pass your provider's configuration directly to the CloudMapping class, you will get a "raw" cloud-mapping which accepts only byte-likes as values. Along with the .with_pickle() serialisation utility, .with_json() and .with_json_zlib() also exist.
You may build your own serialisation either by using zict, or by calling .with_buffers([dumps_1, dumps_2, ..., dumps_N], [loads_1, loads_2, ..., loads_N]), where dumps and loads are the ordered functions to serialise and parse your data respectively.
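A sketch of a custom chain built with .with_buffers(), assuming it accepts the same provider configuration as .with_pickle() and that the loads chain reverses the dumps chain; the JSON-then-compress pipeline is only an illustration:
import json
import zlib

from cloudmappings import AzureBlobMapping

# dumps are applied in order: object -> JSON string -> UTF-8 bytes -> compressed bytes
dumps = [json.dumps, str.encode, zlib.compress]
# loads are assumed to undo that chain: compressed bytes -> UTF-8 bytes -> string -> object
loads = [zlib.decompress, bytes.decode, json.loads]

cm = AzureBlobMapping.with_buffers(
    dumps,
    loads,
    account_url="AZURE_BLOB_STORAGE_URL",
    container_name="CONTAINER_NAME",
    credential=AZURE_CREDENTIAL_OBJECT,
)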
Download files
File details
Details for the file cloud-mappings-0.8.0.tar.gz.
File metadata
- Download URL: cloud-mappings-0.8.0.tar.gz
- Upload date:
- Size: 10.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/4.3.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.6.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | c2912fcad906b5f16bd08870fa76f4b5e30137f17715b71b1bb1c08f2eb25883
MD5 | 7ee78e48cd41d3c0e8a2949a1f4d19be
BLAKE2b-256 | 0c50fead531d7bf5fa987b95d2f2d0920fc32a3ec77dd940095fad47dc6e037b
File details
Details for the file cloud_mappings-0.8.0-py3-none-any.whl.
File metadata
- Download URL: cloud_mappings-0.8.0-py3-none-any.whl
- Upload date:
- Size: 12.9 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.4.1 importlib_metadata/4.3.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.6.13
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3864c2ddbaebe79f1836682fcbec2cba1d0168e250066b1c9d856de56d8ea581
MD5 | 609c44e620a262f36b785c16d1333e38
BLAKE2b-256 | 6737eea466ba0bbf81987b2212e9b4834dbae1a2fe5a7c8f50079822e9e7a5fd