cloud-mappings
MutableMapping implementations for common cloud storage providers
For now Azure Blob Storage, Google Cloud Storage, and AWS S3 are implemented. Contributions of new providers are welcome.
Installation
With pip:
pip install cloud-mappings
Instantiation
AzureBlobMapping:
from cloudmappings import AzureBlobMapping

cm = AzureBlobMapping.with_pickle(
    account_url="AZURE_BLOB_STORAGE_URL",
    container_name="CONTAINER_NAME",
    credential=AZURE_CREDENTIAL_OBJECT,
)
GoogleCloudStorageMapping:
from cloudmappings import GoogleCloudStorageMapping

cm = GoogleCloudStorageMapping.with_pickle(
    project="GCP_PROJECT",
    credentials=GCP_CREDENTIALS_OBJECT,
    bucket_name="BUCKET_NAME",
)
AWSS3Mapping:
from cloudmappings import AWSS3Mapping

cm = AWSS3Mapping.with_pickle(
    bucket_name="AWS_BUCKET_NAME",
    silence_warning=False,
)
Note that AWS S3 does not support server-side atomic requests, so it is not recommended for concurrent use. A warning is printed out by default but may be silenced by passing silence_warning=True.
Usage
Use it just like a standard dict()!

cm["key"] = 1000
cm["key"]  # returns 1000
del cm["key"]
"key" in cm  # returns False
Cloud Sync
Each mapping keeps an internal dict of etags, which it uses to ensure it only reads, overwrites, or deletes data it expects to. If the value in storage is not what the mapping expects, a cloudmappings.errors.KeySyncError will be raised. If you want your operation to go through anyway, you will need to sync your mapping with the cloud by calling either .sync_with_cloud() to sync all keys, or .sync_with_cloud(key) to sync a single key. By default .sync_with_cloud() is called on instantiation if the underlying provider storage already exists. You may skip this initial sync by passing an additional sync_initially=False parameter when you instantiate your mapping.
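The etag mechanism can be illustrated with a small local stand-in. Note that EtagMapping below is a hypothetical sketch for illustration only, not the library's implementation; only the KeySyncError name and .sync_with_cloud() signature come from the library.

```python
class KeySyncError(Exception):
    """Local stand-in for cloudmappings.errors.KeySyncError."""


class EtagMapping(dict):
    """Hypothetical sketch of etag-checked writes (not the real library)."""

    def __init__(self):
        super().__init__()
        self._etags = {}          # etags this mapping last saw
        self._storage_etags = {}  # etags as "the cloud" currently has them

    def __setitem__(self, key, value):
        # Refuse to overwrite data whose current version we have not seen
        if self._etags.get(key) != self._storage_etags.get(key):
            raise KeySyncError(key)
        super().__setitem__(key, value)
        new_etag = object()  # each successful write gets a fresh etag
        self._etags[key] = new_etag
        self._storage_etags[key] = new_etag

    def sync_with_cloud(self, key=None):
        # Accept the cloud's current etags, for one key or for all keys
        keys = [key] if key is not None else list(self._storage_etags)
        for k in keys:
            self._etags[k] = self._storage_etags.get(k)


cm = EtagMapping()
cm["key"] = 1000
# Simulate another writer updating the value behind our back:
cm._storage_etags["key"] = object()
try:
    cm["key"] = 2000  # raises: our etag is stale
except KeySyncError:
    cm.sync_with_cloud("key")
    cm["key"] = 2000  # succeeds after syncing
```

In the real library the etags come from the storage provider's conditional-request support, which is why AWS S3 (lacking server-side atomic requests) is not recommended for concurrent use.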
Serialisation
If you don't call .with_pickle() and instead pass your provider's configuration directly to the mapping object, you will get a "raw" mapping which only accepts byte-likes as values. You may build your own serialisation either by using zict, or by calling .with_buffers([dumps_1, loads_1, dumps_2, loads_2, ...]), where the dumps and loads functions are the ordered serialisation and deserialisation steps for your data respectively.
The following utilities exist as simple starting points: .with_pickle(), .with_json(), .with_json_zlib().
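The ordering of a dumps/loads chain can be illustrated with plain functions, no cloud mapping required. This sketch assumes the dumps functions apply first-to-last and the loads functions undo them in reverse (an assumption about .with_buffers() semantics, not confirmed behaviour); it pairs pickle with zlib, similar to what .with_json_zlib() does for JSON.

```python
import pickle
import zlib

# The ordered functions, as they would be passed to
# .with_buffers([dumps_1, loads_1, dumps_2, loads_2]):
dumps_1, loads_1 = pickle.dumps, pickle.loads
dumps_2, loads_2 = zlib.compress, zlib.decompress


def serialise(value):
    # dumps functions are applied in order: pickle first, then compress
    return dumps_2(dumps_1(value))


def deserialise(raw):
    # loads functions reverse the chain: decompress, then unpickle
    return loads_1(loads_2(raw))


data = {"key": [1, 2, 3]}
assert deserialise(serialise(data)) == data
```

A raw mapping only ever sees the bytes produced by serialise(), so any pairing of functions works as long as each loads step inverts its dumps step.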