
dol

Base builtin tools to make and transform data object layers (dols).

The package is light-weight: pure Python, with no third-party dependencies.

To install: pip install dol

Documentation here

Note: This project started as py2store. dol, the core of py2store, has now been factored out, and many of the specialized data object layers have moved to separate packages. py2store now acts more as an aggregator package -- a shopping mall where you can quickly access many (but not all) functionalities that use dol.

It's advised to use dol (and/or its specialized spin-off packages) directly when the core functionality is all you need.

A few highlights of py2store's README

What is this?

Storage CRUD how and where you want it.

List, read, write, and delete data in a structured data source/target as if you were manipulating simple python builtins (dicts, lists), or through the interface you want to interact with, with configuration and physical particularities out of the way -- and be able to change those particularities without having to change the business-logic code.

If you're not a "read from top to bottom" kinda person, here are some tips: Quick peek will show you a simple example of how it looks and feels. Use cases will give you an idea of how py2store can be useful to you, if at all.

The section with the best bang for the buck is probably remove (much of the) data access entropy. It will give you simple (but real) examples of how to use py2store tooling to bend your interface with data to your will.

How it works will give you a sense of how it works. More examples will give you a taste of how you can adapt the three main aspects of storage (persistence, serialization, and indexing) to your needs.

Install it (e.g. pip install py2store).

Quick peek

Think of the type of storage you want to use and just go ahead, as if you were using a dict. Here's an example for local storage (you must use string keys only here).

>>> from py2store import QuickStore
>>>
>>> store = QuickStore()  # will print what (tmp) rootdir it is choosing
>>> # Write something and then read it out again
>>> store['foo'] = 'baz'
>>> 'foo' in store  # do you have the key 'foo' in your store?
True
>>> store['foo']  # what is the value for 'foo'?
'baz'
>>>
>>> # Okay, it behaves like a dict, but go have a look in your file system,  
>>> # and see that there is now a file in the rootdir, named 'foo'!
>>> 
>>> # Write something more complicated
>>> store['hello/world'] = [1, 'flew', {'over': 'a', "cuckoo's": map}]
>>> stored_val = store['hello/world']
>>> stored_val == [1, 'flew', {'over': 'a', "cuckoo's": map}]  # was it retrieved correctly?
True
>>>
>>> # how many items do you have now?
>>> assert len(store) >= 2  # can't be sure there were no elements before, so can't assert == 2
>>> 
>>> # delete the stuff you've written
>>> del store['foo']
>>> del store['hello/world']

QuickStore will by default store things in local files, using pickle as the serializer. If a root directory is not specified, it will use a tmp directory that it creates the first time you try to store something. It will create any directories needed to satisfy any/key/that/contains/slashes. Of course, everything is configurable.

A list of stores for various uses

py2store provides tools to create the dict-like interface to data you need. If you want to just use existing interfaces, build on it, or find examples of how to make such interfaces, check out the ever-growing list of py2store-using projects:

  • mongodol: For MongoDB
  • hear: Read/write audio data flexibly.
  • tabled: Data as pandas.DataFrame from various sources
  • msword: Simple mapping view to docx (Word Doc) elements
  • sshdol: Remote (ssh) files access
  • haggle: Easily search, download, and use kaggle datasets.
  • pyckup: Grab data simply and define protocols for others to do the same.
  • hubcap: Dict-like interface to github.
  • graze: Cache the internet.
  • grub: A ridiculously simple search engine maker.

Just for fun projects:

  • cult: Religious texts search engine. An 18-minute application of grub.
  • laugh: A (py2store-based) joke finder.

Use cases

Interfacing reads

How many times did someone share some data with you in the form of a zip of some nested folders whose structure and naming choices are fascinatingly obscure? And how much time did you then spend writing code to interface with that freak of nature? Well, one of the intents of py2store is to make that easier to do. You still need to understand the structure of the data store and how to deserialize that data into python objects you can manipulate. But with the proper tool, you shouldn't have to do much more than that.
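
As a taste of what such an interface can look like, here is a hand-rolled, read-only sketch of a dict-like view over a zip archive, using only the standard library (the ZipValues name and the 'data.zip' path are just for illustration; py2store/dol and their extensions give you ready-made wrappers of this sort):

>>> # A read-only mapping over a zip of nested folders: keys are member
>>> # paths inside the archive, values are the raw bytes of each member.
>>> from collections.abc import Mapping
>>> from zipfile import ZipFile
>>>
>>> class ZipValues(Mapping):
...     def __init__(self, zip_path):
...         self._zip = ZipFile(zip_path)
...     def __getitem__(self, k):
...         return self._zip.read(k)
...     def __iter__(self):
...         return iter(self._zip.namelist())
...     def __len__(self):
...         return len(self._zip.namelist())
>>>
>>> # z = ZipValues('data.zip')          # hypothetical archive
>>> # list(z)                            # the member paths
>>> # z['some/nested/obscure_name.csv']  # the bytes of that member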

Changing where and how things are stored

Ever had to switch where you persist things (say from the file system to S3), or change the way you key into your data, or the way that data is serialized? If you use py2store tools to separate the different storage concerns, it'll be quite easy to change, since the change will be localized. And if you're dealing with code that was already written, with concerns all mixed up, py2store should still be able to help, since you'll be able to more easily give the new system a facade that makes it look like the old one.

All of this applies to databases as well, insofar as the CRUD operations you're using are covered by the base methods.
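
Here's a minimal sketch of that separation: the business logic below only assumes dict-like write access (the record_results function and the data are made up for illustration), so swapping the concrete store later doesn't touch it.

>>> # Business logic written against the mapping interface only; which
>>> # concrete store you hand it (dict, files, S3, DB...) is a detail.
>>> def record_results(store, results):
...     for key, value in results.items():
...         store[key] = value
>>>
>>> in_memory = dict()  # today: a plain dict
>>> record_results(in_memory, {'exp1': 0.93})
>>> # Tomorrow: pass a file-backed or database-backed mapping instead;
>>> # record_results doesn't change, only the store you pass in does.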

Adapters: When the learning curve is in the way of learning

Shiny new storage mechanisms (DBs, etc.) are born constantly, some folks start using them, and we are eventually led to use them as well if we need to work with those folks' systems. And though we'd love to learn the wonderful new capabilities the new kid on the block has, sometimes we just don't have time for that.

Wouldn't it be nice if someone wrote an adapter to the new system that had an interface we were familiar with? Talking to SQL as if it were mongo (or vice versa). Talking to S3 as if it were a file system. It's not a long-term solution: if we're really going to be using the new system intensively, we should learn it. But when you've just got to get stuff done, having a familiar facade to something new is a life saver.

py2store would like to make it easier for you to roll out an adapter, so you can talk to the new system in the way you're already familiar with.
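
As a sketch of the adapter idea (the client object and its fetch/put/remove/keys methods are hypothetical; only the wrapping pattern matters):

>>> # Give an unfamiliar client a dict-like face by implementing the
>>> # mapping protocol on top of whatever methods it actually exposes.
>>> from collections.abc import MutableMapping
>>>
>>> class DictFacade(MutableMapping):
...     def __init__(self, client):
...         self._client = client
...     def __getitem__(self, k):
...         return self._client.fetch(k)      # hypothetical client method
...     def __setitem__(self, k, v):
...         self._client.put(k, v)            # hypothetical client method
...     def __delitem__(self, k):
...         self._client.remove(k)            # hypothetical client method
...     def __iter__(self):
...         return iter(self._client.keys())  # hypothetical client method
...     def __len__(self):
...         return len(list(self._client.keys()))
>>>
>>> # store = DictFacade(new_db_client)  # now talk to it like a dict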

Thinking about storage later, if ever

You have a new project or need to write a new app. You'll need to store stuff and read stuff back. Stuff: different kinds of resources that your app will need to function. Some people enjoy thinking about how to optimize that aspect. I don't. I'll leave it to the experts to do so when the time comes. Often though, the time is later, if ever. Few proofs of concept and MVPs ever make it to prod.

So instead, I'd like to just get on with the business logic and write my program. So what I need is an easy way to get some minimal storage functionality. But when the time comes to optimize, I shouldn't have to change my code, but instead just change the way my DAO does things. What I need is py2store.

Remove data access entropy

Data comes from many different sources, organizations, and formats.

Data is needed in many different contexts, each of which comes with its own natural data organization and formats.

In between the two: an entropic mess of ad-hoc connections and annoying, time-consuming, error-prone boilerplate.

py2store (and its now-many extensions) is there to mitigate this.

The design gods say SOC, DRY, SOLID* and such. That's good design, yes. But it can take more work to achieve these principles. We'd like to make it easier to do it right than do it wrong.

(*) Separation Of Concerns, Don't Repeat Yourself, https://en.wikipedia.org/wiki/SOLID

We need to determine the most common operations we want to do on data, and decide on a common way to express those operations, no matter what the implementation details are:

  • get/read some data
  • set/write some data
  • list/see what data we have
  • filter
  • cache ...

Looking at this, we see that the base operations for complex data systems such as data bases and file systems overlap significantly with the base operations on python (or any programming language) objects.

So we'll reflect this in our choice of a common "language" for these operations. For example, once projected to a py2store object, iterating over the contents of a database, over files, or over the elements of a python (iterable) object should look the same in code. By achieving this we achieve SOC, but we also set ourselves up for tooling that can assume this consistency, and therefore be DRY and satisfy many of the SOLID principles of design.
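
For instance, here's the same function running unchanged over a plain dict and over a Store (the dol class introduced in the design section below, which wraps a dict by default):

>>> # Once everything speaks the mapping "language", the same code runs
>>> # against any backend -- a dict here, files or a database elsewhere.
>>> from dol import Store
>>>
>>> def sorted_keys(store):
...     return sorted(store)  # iterate the keys, wherever they live
>>>
>>> sorted_keys({'b': 2, 'a': 1})
['a', 'b']
>>> s = Store()  # wraps a dict by default
>>> s['b'] = 2
>>> s['a'] = 1
>>> sorted_keys(s)
['a', 'b']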

Also mentionable: So far, py2store core tools are all pure python -- no dependencies on anything else.

Now, when you want to specialize a store (say, to talk to databases or web services, or to acquire special formats (audio, etc.)), you'll need to pull in a few helpful packages. But the core tooling is pure.

A few words about design

By store we mean key-value store. This could be files in a filesystem, objects in s3, or a database. Where and how the content is stored should be specified, but StoreInterface offers a dict-like interface to this.

__getitem__ calls: _id_of_key, _obj_of_data
__setitem__ calls: _id_of_key, _data_of_obj
__delitem__ calls: _id_of_key
__iter__    calls: _key_of_id
>>> from dol import Store

A Store can be instantiated with no arguments. By default it will make a dict and wrap that.

>>> # Default store: no key or value conversion ################################################
>>> s = Store()
>>> s['foo'] = 33
>>> s['bar'] = 65
>>> assert list(s.items()) == [('foo', 33), ('bar', 65)]
>>> assert list(s.store.items()) == [('foo', 33), ('bar', 65)]  # see that the store contains the same thing

Now let's make stores that have a key and value conversion layer: input keys will be upper-cased and output keys lower-cased; input values (assumed to be ints) will be converted to single-character ascii strings, and vice versa.

>>>
>>> def test_store(s):
...     s['foo'] = 33  # write 33 to 'foo'
...     assert 'foo' in s  # __contains__ works
...     assert 'no_such_key' not in s  # and so does 'not in'
...     s['bar'] = 65  # write 65 to 'bar'
...     assert len(s) == 2  # there are indeed two elements
...     assert list(s) == ['foo', 'bar']  # these are the keys
...     assert list(s.keys()) == ['foo', 'bar']  # the keys() method works!
...     assert list(s.values()) == [33, 65]  # the values() method works!
...     assert list(s.items()) == [('foo', 33), ('bar', 65)]  # these are the items
...     assert list(s.store.items()) == [('FOO', '!'), ('BAR', 'A')]  # but note the internal representation
...     assert s.get('foo') == 33  # the get method works
...     assert s.get('no_such_key', 'something') == 'something'  # return a default value
...     del s['foo']  # you can delete an item given its key
...     assert len(s) == 1  # see, only one item left!
...     assert list(s.items()) == [('bar', 65)]  # here it is
>>>

We can introduce this conversion layer in several ways.

Here are a few...

by subclassing

>>> # by subclassing ###############################################################################
>>> class MyStore(Store):
...     def _id_of_key(self, k):
...         return k.upper()
...     def _key_of_id(self, _id):
...         return _id.lower()
...     def _data_of_obj(self, obj):
...         return chr(obj)
...     def _obj_of_data(self, data):
...         return ord(data)
>>> s = MyStore(store=dict())  # note that you don't need to specify dict(), since it's the default
>>> test_store(s)
>>>

by assigning functions to converters

>>> # by assigning functions to converters ##########################################################
>>> class MyStore(Store):
...     def __init__(self, store, _id_of_key, _key_of_id, _data_of_obj, _obj_of_data):
...         super().__init__(store)
...         self._id_of_key = _id_of_key
...         self._key_of_id = _key_of_id
...         self._data_of_obj = _data_of_obj
...         self._obj_of_data = _obj_of_data
...
>>> s = MyStore(dict(),
...             _id_of_key=lambda k: k.upper(),
...             _key_of_id=lambda _id: _id.lower(),
...             _data_of_obj=lambda obj: chr(obj),
...             _obj_of_data=lambda data: ord(data))
>>> test_store(s)
>>>

using a Mixin class

>>> # using a Mixin class #############################################################################
>>> class Mixin:
...     def _id_of_key(self, k):
...         return k.upper()
...     def _key_of_id(self, _id):
...         return _id.lower()
...     def _data_of_obj(self, obj):
...         return chr(obj)
...     def _obj_of_data(self, data):
...         return ord(data)
...
>>> class MyStore(Mixin, Store):  # note that the Mixin must come before Store in the mro
...     pass
...
>>> s = MyStore()  # no dict()? No need -- it's the default anyway
>>> test_store(s)

adding wrapper methods to an already made Store instance

>>> # adding wrapper methods to an already made Store instance #########################################
>>> s = Store(dict())
>>> s._id_of_key=lambda k: k.upper()
>>> s._key_of_id=lambda _id: _id.lower()
>>> s._data_of_obj=lambda obj: chr(obj)
>>> s._obj_of_data=lambda data: ord(data)
>>> test_store(s)
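
dol also provides a wrap_kvs helper that applies such a conversion layer for you. The sketch below assumes the keyword names id_of_key, key_of_id, data_of_obj, and obj_of_data; check the dol documentation for the exact signature.

>>> # Same conversion layer, applied with dol's wrap_kvs helper.
>>> # Keyword names are assumed here -- verify against the dol docs.
>>> from dol import wrap_kvs
>>> s = wrap_kvs(
...     dict(),
...     id_of_key=lambda k: k.upper(),
...     key_of_id=lambda _id: _id.lower(),
...     data_of_obj=lambda obj: chr(obj),
...     obj_of_data=lambda data: ord(data),
... )
>>> s['foo'] = 33
>>> s['foo']
33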

Why the name?

  • because it's short
  • because it's cute
  • because it reminds one of "Russian dolls" (one way to think of wrappers)
  • because we can come up with an acronym that contains "Data Object" in it.
