A Python class for using Redis, or other key-value stores, and caching the values for read-heavy workloads

Project description

A Python class for using Redis, or other key-value stores, as a dictionary, caching the values locally for read-heavy workloads. Heavily inspired by pylru.

Usage

The idea is that deleting or updating keys in an instance of PubSubRedisDict or PubSubCacheManager will update the matching cached keys in all instances of PubSubCacheManager. PubSubCacheManager therefore maintains a cache of recently used keys, using either an LRU or a plain dict(). This reduces the round-trip latency and network overhead for reads of the cached keys.

RedisDict and PubSubRedisDict should work with instances of redis.StrictRedis or rediscluster.StrictRedisCluster. Use the prefix to manage Redis key namespaces.
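
Because each instance only touches keys under its own prefix, the same Redis connection can back several independent dictionaries. A minimal sketch, assuming the classes import from a redis_pubsub_dict module (adjust to your installed distribution):

import redis
from redis_pubsub_dict import RedisDict

r = redis.StrictRedis(host='redis', port=6379)
users = RedisDict(r, 'users')    # keys live under the 'users' prefix
config = RedisDict(r, 'config')  # keys live under the 'config' prefix

users['alice'] = {'role': 'admin'}
config['alice'] = 'unrelated value'

# identical keys in different namespaces never collide
assert users['alice'] != config['alice']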

RedisDict

Just like a normal dictionary, but networked. Initialisation won't take a dictionary or iterable for now, as it needs connection and namespace information.

# import paths assumed; adjust to your installed distribution
from rediscluster import StrictRedisCluster
from redis_pubsub_dict import RedisDict

rc = StrictRedisCluster(startup_nodes=[{"host": "redis", "port": "6379"}])
reddict = RedisDict(rc, 'namespace')

# you can set
reddict[1] = 1
reddict[2] = [1,2,3]
reddict['hello'] = 'world'
reddict[('complex',1)] = {'I': {'Am': {'Quite': ['a', 'complex', {'object': {} }]}}}

# get somewhere else
reddict[1]
reddict['1'] # note: the same as reddict[1], since keys are stringified
reddict[('complex',1)]
reddict["('complex', 1)"] # the key is str(('complex',1))

# delete
del reddict[1]
# ... etc.

PubSubRedisDict

Like RedisDict, but publishes key update and delete events to a <namespace>/[update|delete] channel.

redpubsub = PubSubRedisDict(rc, 'namespace')
# etc., as before
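
The events are plain Redis pub/sub messages, so you can watch them directly with a redis-py subscription. A minimal sketch, assuming the channels are literally named 'namespace/update' and 'namespace/delete' for the prefix above:

p = rc.pubsub()
p.subscribe('namespace/update', 'namespace/delete')

redpubsub['greeting'] = 'hello'  # triggers a message on namespace/update

for message in p.listen():
    if message['type'] == 'message':  # skip subscribe confirmations
        print(message['channel'], message['data'])
        break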

PubSubCacheManager

Like pylru.WriteThroughCacheManager, but updates cached keys from the store when it receives a message on the <namespace>/[update|delete] channel.

import pylru

cache = pylru.lrucache(10) # probably larger than 10 in practice
redstore = PubSubRedisDict(rc, 'namespace')
redcache = PubSubCacheManager(redstore, cache)
# etc., as before
# inspect the local cache
print(dict(redcache.cache))
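
To see the invalidation in action, create two managers against the same namespace, as two separate processes normally would. A sketch, assuming both share the rc connection from above; in a single process the subscriber may need a moment to consume the message:

import time

cache_a, cache_b = pylru.lrucache(10), pylru.lrucache(10)
client_a = PubSubCacheManager(PubSubRedisDict(rc, 'namespace'), cache_a)
client_b = PubSubCacheManager(PubSubRedisDict(rc, 'namespace'), cache_b)

client_a['shared'] = 1     # write through a, published to the channel
print(client_b['shared'])  # read through b caches the key locally
client_a['shared'] = 42    # a's write updates b's cached copy via pub/sub
time.sleep(0.1)            # give b's subscriber time to process the message
print(client_b['shared'])  # 42, not a stale 1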

Further uses

You can hook RedisDict or PubSubRedisDict up to pylru.WriteBackCacheManager to get a Redis-backed dictionary that only writes to Redis on flush, or when an item pops off the LRU, for write-intensive workloads. However, a lot more work would be needed to add the pubsub mechanism, as there are difficult cases to consider, such as: what happens when the cache is dirty and we are notified that the store key has been updated?
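
A minimal sketch of the write-back combination (without the pubsub mechanism), assuming the WriteBackCacheManager API from pylru's documentation:

import pylru

store = RedisDict(rc, 'namespace')
wbc = pylru.WriteBackCacheManager(store, 100)

for i in range(1000):
    wbc[i] = i  # buffered locally; entries evicted from the LRU flush to Redis

wbc.sync()      # push remaining dirty entries to Redis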

Limitations

  • All keys are strings; non-string keys are converted with str().

  • msgpack is used to marshal objects to Redis, so msgpack's object limitations apply. You can, however, monkey-patch the module's loads and dumps methods if you like (see the sketch after this list).

  • publish will publish to all consuming dictionary instances; there is no partitioning, so writes and updates are expensive. You could come up with a partitioning strategy to improve this.

  • Published items eventually end up in the watching caches, so there may be a time lag between one client publishing a change and the key updating in another client's cache.
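
For the serialization point above, a sketch of swapping msgpack for JSON; the module-level loads/dumps names are assumed from the note in that list, so check your installed version:

import json
import redis_pubsub_dict  # module name assumed

# replace the marshalling pair the dictionaries use internally
redis_pubsub_dict.dumps = lambda obj: json.dumps(obj).encode('utf-8')
redis_pubsub_dict.loads = lambda data: json.loads(data)

reddict = redis_pubsub_dict.RedisDict(rc, 'jsonspace')
reddict['key'] = {'now': ['marshalled', 'as', 'json']}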
