
Python Client for Google Cloud Datastore

Project description

This is a shared codebase for gcloud-aio-datastore and gcloud-rest-datastore.


Installation

$ pip install --upgrade gcloud-{aio,rest}-datastore
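
Since the two packages are built from the same codebase, the import path is usually the only difference between them. A minimal sketch (assuming at least one of the two variants is installed) for picking whichever is available:

try:
    from gcloud.aio.datastore import Datastore  # asyncio variant
except ImportError:
    from gcloud.rest.datastore import Datastore  # synchronous variant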

Usage

We’re still working on documentation; for now, this should help get you started:

from gcloud.rest.datastore import Datastore
from gcloud.rest.datastore import Direction
from gcloud.rest.datastore import Filter
from gcloud.rest.datastore import GQLQuery
from gcloud.rest.datastore import Key
from gcloud.rest.datastore import Operation
from gcloud.rest.datastore import PathElement
from gcloud.rest.datastore import PropertyFilter
from gcloud.rest.datastore import PropertyFilterOperator
from gcloud.rest.datastore import PropertyOrder
from gcloud.rest.datastore import Query
from gcloud.rest.datastore import Value

ds = Datastore('my-gcloud-project', '/path/to/creds.json')
key1 = Key('my-gcloud-project', [PathElement('Kind', 'entityname')])
key2 = Key('my-gcloud-project', [PathElement('Kind', 'entityname2')])

# batched lookups
entities = await ds.lookup([key1, key2])

# convenience functions for any datastore mutations
await ds.insert(key1, {'a_boolean': True, 'meaning_of_life': 41})
await ds.update(key1, {'a_boolean': True, 'meaning_of_life': 42})
await ds.upsert(key1, {'animal': 'aardvark'})
await ds.delete(key1)

# or build your own mutation sequences with full transaction support
transaction = await ds.beginTransaction()
try:
    mutations = [
        ds.make_mutation(Operation.INSERT, key1, properties={'animal': 'sloth'}),
        ds.make_mutation(Operation.UPSERT, key1, properties={'animal': 'aardvark'}),
        ds.make_mutation(Operation.INSERT, key2, properties={'animal': 'aardvark'}),
    ]
    await ds.commit(transaction, mutations=mutations)
except Exception:
    await ds.rollback(transaction)

# support for partial keys
partial_key = Key('my-gcloud-project', [PathElement('Kind')])
# and ID allocation or reservation
allocated_keys = await ds.allocateIds([partial_key])
await ds.reserveIds(allocated_keys)

# query support
property_filter = PropertyFilter(prop='answer',
                                 operator=PropertyFilterOperator.EQUAL,
                                 value=Value(42))
property_order = PropertyOrder(prop='length',
                               direction=Direction.DESCENDING)
query = Query(kind='the_meaning_of_life',
              query_filter=Filter(property_filter),
              order=property_order)
results = await ds.runQuery(query)

# alternatively, query support using GQL
gql_query = GQLQuery('SELECT * FROM the_meaning_of_life WHERE answer = @answer',
                     named_bindings={'answer': 42})
results = await ds.runQuery(gql_query)

# close the HTTP session
# Note that other options include:
# * providing your own session: ``Datastore(.., session=session)``
# * using a context manager: ``async with Datastore(..) as ds:``
await ds.close()
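
Putting the pieces together, a minimal end-to-end sketch (assuming the asyncio-style usage shown above and credentials at the given path):

import asyncio

from gcloud.rest.datastore import Datastore
from gcloud.rest.datastore import Key
from gcloud.rest.datastore import PathElement

async def main():
    # the context-manager form closes the underlying HTTP session on exit
    async with Datastore('my-gcloud-project', '/path/to/creds.json') as ds:
        key = Key('my-gcloud-project', [PathElement('Kind', 'entityname')])
        await ds.upsert(key, {'animal': 'aardvark'})
        print(await ds.lookup([key]))

asyncio.run(main())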

Custom Subclasses

gcloud-rest-datastore provides class interfaces mirroring all official Google API types, e.g. Key and PathElement, Entity and EntityResult, QueryResultBatch, and Value. These types are returned from the various Datastore operations; for example, Datastore.allocateIds(...) returns a list of Key objects.
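
Keys with ancestor paths follow the same pattern as the single-element keys above: the path is simply a longer list of PathElements. A sketch, reusing the ds client from the usage section (the kinds and names here are hypothetical, and we assume multi-element paths are accepted the same way):

# a ChildKind entity nested under a ParentKind entity (hypothetical kinds/names)
nested_key = Key('my-gcloud-project', [
    PathElement('ParentKind', 'parent-name'),
    PathElement('ChildKind', 'child-name'),
])
await ds.upsert(nested_key, {'animal': 'aardvark'})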

For advanced usage, all of these datatypes may be overridden. A common use case is to deserialize entities into more specific classes. For example, given a custom entity class such as:

class MyEntityKind(gcloud.rest.datastore.Entity):
    def __init__(self, key, properties=None) -> None:
        self.key = key
        self.is_an_aardvark = (properties or {}).get('aardvark', False)

    def __repr__(self):
        return "I'm an aardvark!" if self.is_an_aardvark else "Sorry, nope"
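
As a quick illustration of the class above (reusing key1 from the usage section; the property values are made up):

aardvark = MyEntityKind(key1, {'aardvark': True})
not_an_aardvark = MyEntityKind(key1, {})
repr(aardvark)         # "I'm an aardvark!"
repr(not_an_aardvark)  # "Sorry, nope"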

We can then configure gcloud-rest-datastore to serialize/deserialize from this custom entity class with:

class MyEntityResult(gcloud.rest.datastore.EntityResult):
    entity_kind = MyEntityKind

class MyCustomDatastore(gcloud.rest.datastore.Datastore):
    entity_result_kind = MyEntityResult
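
With that in place, lookups performed through the custom client should hand back MyEntityKind objects. A sketch, reusing key1 from the usage section above:

ds = MyCustomDatastore('my-gcloud-project', '/path/to/creds.json')
entities = await ds.lookup([key1])  # entity results should now deserialize via MyEntityKind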

The full list of class hooks which may be overridden in this way, shown here with their default values, is:

class MyVeryCustomDatastore(gcloud.rest.datastore.Datastore):
    datastore_operation_kind = DatastoreOperation
    entity_result_kind = EntityResult
    entity_result_kind.entity_kind = Entity
    entity_result_kind.entity_kind.key_kind = Key
    key_kind = Key
    key_kind.path_element_kind = PathElement
    mutation_result_kind = MutationResult
    mutation_result_kind.key_kind = Key
    query_result_batch_kind = QueryResultBatch
    query_result_batch_kind.entity_result_kind = EntityResult
    value_kind = Value
    value_kind.key_kind = Key

class MyVeryCustomQuery(gcloud.rest.datastore.Query):
    value_kind = Value

class MyVeryCustomGQLQuery(gcloud.rest.datastore.GQLQuery):
    value_kind = Value

You can then drop in the MyVeryCustomDatastore class anywhere you previously used Datastore, and do the same for Query and GQLQuery.

To override any sub-key, you’ll need to override any parents which use it. For example, if you want to use a custom Key kind and be able to use queries with it, you will need to implement your own Value, Query, and GQLQuery classes and wire them up to the rest of the custom classes:

class MyKey(gcloud.rest.datastore.Key):
    pass

class MyValue(gcloud.rest.datastore.Value):
    key_kind = MyKey

class MyEntity(gcloud.rest.datastore.Entity):
    key_kind = MyKey
    value_kind = MyValue

class MyEntityResult(gcloud.rest.datastore.EntityResult):
    entity_kind = MyEntity

class MyQueryResultBatch(gcloud.rest.datastore.QueryResultBatch):
    entity_result_kind = MyEntityResult

class MyDatastore(gcloud.rest.datastore.Datastore):
    key_kind = MyKey
    entity_result_kind = MyEntityResult
    query_result_batch_kind = MyQueryResultBatch
    value_kind = MyValue

class MyQuery(gcloud.rest.datastore.Query):
    value_kind = MyValue

class MyGQLQuery(gcloud.rest.datastore.GQLQuery):
    value_kind = MyValue
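
A sketch of using the wired-up classes together (reusing the property_filter and property_order objects from the query example above):

ds = MyDatastore('my-gcloud-project', '/path/to/creds.json')
query = MyQuery(kind='the_meaning_of_life',
                query_filter=Filter(property_filter),
                order=property_order)
results = await ds.runQuery(query)
# entities in the results should now be MyEntity instances keyed by MyKey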

Contributing

Please see our contributing guide.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

gcloud-rest-datastore-7.1.0.tar.gz (15.9 kB)

Uploaded: Source

Built Distribution

gcloud_rest_datastore-7.1.0-py2.py3-none-any.whl (20.4 kB)

Uploaded: Python 2, Python 3

File details

Details for the file gcloud-rest-datastore-7.1.0.tar.gz.

File metadata

  • Download URL: gcloud-rest-datastore-7.1.0.tar.gz
  • Upload date:
  • Size: 15.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.13 CPython/3.9.12 Linux/5.13.0-1017-aws

File hashes

Hashes for gcloud-rest-datastore-7.1.0.tar.gz
Algorithm Hash digest
SHA256 3fe5ddc36d541d38048fb4a809696f2c6fc48f19b4e0c7b3aadb9fda245d9f68
MD5 ff193cf26ff92029c9ed062651ccd6ba
BLAKE2b-256 5fad662a6b3f5b2b8047c047c87656345136a7d070be2004dfe8d94dcb9107de
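
To check a downloaded sdist against the SHA256 digest above, a short sketch (file name as listed above):

import hashlib

expected = '3fe5ddc36d541d38048fb4a809696f2c6fc48f19b4e0c7b3aadb9fda245d9f68'
with open('gcloud-rest-datastore-7.1.0.tar.gz', 'rb') as f:
    actual = hashlib.sha256(f.read()).hexdigest()
assert actual == expected, 'SHA256 mismatch'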


File details

Details for the file gcloud_rest_datastore-7.1.0-py2.py3-none-any.whl.

File metadata

File hashes

Hashes for gcloud_rest_datastore-7.1.0-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 46b575e2db6af41aff13bbeed9e6c52d220ad66f948bf9b820fdfccec57858c2
MD5 6d85e502678d769f0e296ee61e1d9d5d
BLAKE2b-256 08c3fe043c9d3dd87554211f593109b74896fffbe607edc91ec726b376c0c0e7

