Python dictionary class providing persistent storage by serializing state to a JSON file in an Amazon S3 bucket

Project description

S3Dictionary Homepage

Code style: black | License: MIT

What is S3Dictionary?

Implements a persistent Python dictionary class (derived from collections.UserDict) that serializes its state as a JSON file in an Amazon S3 bucket. This works well for simple persistence when all you need is to save some key/value pairs.

S3Dictionary should be a drop-in replacement for a standard Python dict.

If you think you need this module, you probably need something else. That said, S3Dictionary is simple and easy to use if all you need is a persistent dictionary. With lazy saving (autosave=False), it's also reasonably fast.

Installation instructions

python setup.py install
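
Since the package is published on PyPI, it should also be installable with pip:

pip install S3Dictionary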

Example uses of the module

from s3dictionary import S3Dict

if __name__ == "__main__":

    # AWS_BUCKET_NAME, AWS_ACCESS_KEY_ID, and AWS_ACCESS_SECRET_KEY are
    # placeholders for your own bucket name and credentials.

    # Create an instance of S3Dict.
    state = S3Dict(
        bucket_name=AWS_BUCKET_NAME,
        access_key_id=AWS_ACCESS_KEY_ID,
        access_secret_key=AWS_ACCESS_SECRET_KEY,
        file_name="test.json",
    )

    # Assign like any Python dictionary.
    state["foo"] = "bar"

    # Save back to the S3 bucket.
    state.save()

See example.py for additional usage examples.

Module Interface

S3Dict(bucket_name=AWS_BUCKET_NAME,
       access_key_id=AWS_ACCESS_KEY_ID,
       access_secret_key=AWS_ACCESS_SECRET_KEY,
       file_name="filename.json",
       [autosave=False],
       [data=None],
       [default=None])

Constructs a new S3Dict object. If file_name already exists in the bucket, state is restored from that file. If file_name does not exist, a new, empty dictionary is initialized; in that case the file will not be created in the S3 bucket until you call save(), unless autosave=True.

If a dictionary is passed as data, the S3Dict is initialized with its contents; this overwrites any data that was loaded from the file. If a dictionary is passed as default and the file does not exist, the S3Dict is initialized with its contents; unlike data, default never overwrites existing data.
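
For example (a sketch; the file name and initial values below are arbitrary, and the credentials are the same placeholders used above):

# If "counters.json" does not yet exist in the bucket, the dictionary starts
# out as {"visits": 0}; if the file does exist, its contents are loaded and
# default is ignored.
state = S3Dict(
    bucket_name=AWS_BUCKET_NAME,
    access_key_id=AWS_ACCESS_KEY_ID,
    access_secret_key=AWS_ACCESS_SECRET_KEY,
    file_name="counters.json",
    default={"visits": 0},
)

# Passing data instead would overwrite whatever was loaded from the file:
# state = S3Dict(..., file_name="counters.json", data={"visits": 0})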

Required arguments

  • bucket_name: name of S3 bucket
  • access_key_id: AWS access key id
  • access_secret_key: AWS secret key
  • file_name: file name of the serialized dictionary to be stored in S3 bucket

Optional arguments

  • autosave: must be bool, default is False -- setting this to True causes every update to the underlying dictionary to be immediately serialized to S3
  • data: python dictionary used to initialize data
  • default: python dictionary used to initialize data if the file does not already exist in the S3 bucket

S3Dict.save()

Forces the current state to be serialized to the S3 bucket immediately.

S3Dict.load()

Forces state to be immediately reloaded from the S3 bucket. In normal circumstances you won't need to do this, but it can be useful if an external process updates the state information and you want to force a reload.
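
For example (a sketch; the key shown is arbitrary):

state = S3Dict(...)         # same bucket, credentials, and file as the writer

# ... some other process updates the same file in the S3 bucket ...

state.load()                # discard the in-memory copy and re-read from S3
print(state.get("status"))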

S3Dict.autosave = True|False
S3Dict.autosave

Turns autosave on (True) or off (False); the property can also be read to check the current setting. If autosave is True, every update to the dictionary is immediately serialized back to the S3 bucket. Please be sure you understand what this means before turning autosave on!
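
For example (a sketch; the key shown is arbitrary):

state = S3Dict(...)          # autosave defaults to False

state.autosave = True
state["last_run"] = "ok"     # serialized to S3 immediately

print(state.autosave)        # prints True
state.autosave = False       # back to lazy saving; call save() when ready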

S3Dict.file_name
S3Dict.access_key_id
S3Dict.access_secret_key
S3Dict.bucket_name

Read-only properties
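
For example:

state = S3Dict(...)
print(state.bucket_name, state.file_name)   # inspect where the state is stored
# These attributes cannot be reassigned after construction.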

S3Dict.delete()

Clears the stored data and deletes the JSON file from S3. The user must have the S3 DeleteObject permission. This does not delete the S3Dict object instance itself; to do that, follow the call to S3Dict.delete() with del:

from s3dictionary import S3Dict

mys3dict = S3Dict(...)
...
mys3dict.delete() # deletes the data and the file
del mys3dict # deletes the object

Usage Notes

To use S3Dict, you must first create an Amazon S3 bucket and a user with permissions to access the bucket.

I highly recommend Keith Weaver's excellent tutorial on using AWS S3 with Python and setting up buckets and users.

Be careful with autosave, as it could cost you money: if your underlying dict changes a lot, autosave will cause S3Dict to pound your S3 account with frequent PUT requests, which could result in charges. It is also costly in terms of execution speed, as every change to the underlying dictionary forces an interaction with the S3 server.
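
A cheaper pattern is to batch updates with autosave off and write once at the end (a sketch; the keys are arbitrary):

state = S3Dict(...)                 # autosave is False by default

for i in range(1000):
    state[f"item-{i}"] = i          # kept in memory, no S3 traffic

state.save()                        # a single PUT for the whole batch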

Download files

Source Distribution

S3Dictionary-0.13.tar.gz (4.9 kB)

Uploaded Source

Built Distribution

S3Dictionary-0.13-py3-none-any.whl (7.0 kB)

Uploaded Python 3

File details

Details for the file S3Dictionary-0.13.tar.gz.

File metadata

  • Download URL: S3Dictionary-0.13.tar.gz
  • Upload date:
  • Size: 4.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.7.3 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.2

File hashes

Hashes for S3Dictionary-0.13.tar.gz
Algorithm Hash digest
SHA256 2d5c17fa39faabb0e6668032b6a7509b29289966506bdbf821a18e9235d16654
MD5 4a67a4b4ec47229607133e99b62e6790
BLAKE2b-256 b7f070835e1f5f1cf93ff4217ecf2b66936968d8d3bd00d3fbb1a8ea4b8071e6

File details

Details for the file S3Dictionary-0.13-py3-none-any.whl.

File metadata

  • Download URL: S3Dictionary-0.13-py3-none-any.whl
  • Upload date:
  • Size: 7.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.5.0.1 requests/2.21.0 setuptools/40.7.3 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.2

File hashes

Hashes for S3Dictionary-0.13-py3-none-any.whl
Algorithm Hash digest
SHA256 2951b90c2a34c31f7d58bcfeea97c399e8b7929e560c069380fc859e6fac4e65
MD5 4aca5199ea244238f0731219c6017f66
BLAKE2b-256 dccc4dba74716245162bf1cb0278c606374e7cca78a34504ec4d26e20ce97616
