Python Client for Google Cloud Datastore

Project description

This is a shared codebase for gcloud-aio-datastore and gcloud-rest-datastore.

Installation

$ pip install --upgrade gcloud-{aio,rest}-datastore

Usage

We’re still working on documentation; for now, this should help get you started:

from gcloud.rest.datastore import Datastore
from gcloud.rest.datastore import Direction
from gcloud.rest.datastore import Filter
from gcloud.rest.datastore import GQLQuery
from gcloud.rest.datastore import Key
from gcloud.rest.datastore import Operation
from gcloud.rest.datastore import PathElement
from gcloud.rest.datastore import PropertyFilter
from gcloud.rest.datastore import PropertyFilterOperator
from gcloud.rest.datastore import PropertyOrder
from gcloud.rest.datastore import Query
from gcloud.rest.datastore import Value

ds = Datastore('my-gcloud-project', '/path/to/creds.json')
key1 = Key('my-gcloud-project', [PathElement('Kind', 'entityname')])
key2 = Key('my-gcloud-project', [PathElement('Kind', 'entityname2')])

# batched lookups
entities = await ds.lookup([key1, key2])

# convenience functions for any datastore mutations
await ds.insert(key1, {'a_boolean': True, 'meaning_of_life': 41})
await ds.update(key1, {'a_boolean': True, 'meaning_of_life': 42})
await ds.upsert(key1, {'animal': 'aardvark'})
await ds.delete(key1)

# or build your own mutation sequences with full transaction support
transaction = await ds.beginTransaction()
try:
    mutations = [
        ds.make_mutation(Operation.INSERT, key1, properties={'animal': 'sloth'}),
        ds.make_mutation(Operation.UPSERT, key1, properties={'animal': 'aardvark'}),
        ds.make_mutation(Operation.INSERT, key2, properties={'animal': 'aardvark'}),
    ]
    await ds.commit(transaction, mutations=mutations)
except Exception:
    await ds.rollback(transaction)

# support for partial keys
partial_key = Key('my-gcloud-project', [PathElement('Kind')])
# and ID allocation or reservation
allocated_keys = await ds.allocateIds([partial_key])
await ds.reserveIds(allocated_keys)

# query support
property_filter = PropertyFilter(prop='answer',
                                 operator=PropertyFilterOperator.EQUAL,
                                 value=Value(42))
property_order = PropertyOrder(prop='length',
                               direction=Direction.DESCENDING)
query = Query(kind='the_meaning_of_life',
              query_filter=Filter(property_filter),
              order=property_order)
results = await ds.runQuery(query)

# alternatively, query support using GQL
gql_query = GQLQuery('SELECT * FROM the_meaning_of_life WHERE answer = @answer',
                     named_bindings={'answer': 42})
results = await ds.runQuery(gql_query)

# close the HTTP session
# Note that other options include:
# * providing your own session: ``Datastore(.., session=session)``
# * using a context manager: ``async with Datastore(..) as ds:``
await ds.close()
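
As a quick illustration of those alternatives, the context-manager form looks roughly like this (a minimal sketch; the project name and credentials path are placeholders, as above):

# minimal sketch of the ``async with`` option noted in the comments above;
# the session is opened on entry and closed for you on exit
async with Datastore('my-gcloud-project', '/path/to/creds.json') as ds:
    entities = await ds.lookup([key1, key2])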

Custom Subclasses

gcloud-rest-datastore provides class interfaces mirroring all official Google API types, e.g. Key and PathElement, Entity and EntityResult, QueryResultBatch, and Value. These types are returned from the various Datastore operations; for example, Datastore.allocateIds(...) returns a list of Key objects.
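
For instance, continuing the usage example above, the keys handed back by Datastore.allocateIds(...) are plain Key instances and can be passed straight back into other calls (a minimal sketch reusing the ds and partial_key names from earlier):

# allocateIds returns Key objects, so the result can be fed directly
# into other operations such as reserveIds
allocated_keys = await ds.allocateIds([partial_key])
assert all(isinstance(key, Key) for key in allocated_keys)
await ds.reserveIds(allocated_keys)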

For advanced usage, all of these datatypes may be overloaded. A common use-case may be to deserialize entities into more specific classes. For example, given a custom entity class such as:

class MyEntityKind(gcloud.rest.datastore.Entity):
    def __init__(self, key, properties=None) -> None:
        self.key = key
        self.is_an_aardvark = (properties or {}).get('aardvark', False)

    def __repr__(self):
        return "I'm an aardvark!" if self.is_an_aardvark else "Sorry, nope"

We can then configure gcloud-rest-datastore to serialize/deserialize from this custom entity class with:

class MyCustomDatastore(gcloud.rest.datastore.Datastore):
    entity_result_kind.entity_kind = MyEntityKind

The full list of classes which may be overridden in this way is:

class MyVeryCustomDatastore(gcloud.rest.datastore.Datastore):
    datastore_operation_kind = DatastoreOperation
    entity_result_kind = EntityResult
    entity_result_kind.entity_kind = Entity
    entity_result_kind.entity_kind.key_kind = Key
    key_kind = Key
    key_kind.path_element_kind = PathElement
    mutation_result_kind = MutationResult
    mutation_result_kind.key_kind = Key
    query_result_batch_kind = QueryResultBatch
    query_result_batch_kind.entity_result_kind = EntityResult
    value_kind = Value
    value_kind.key_kind = Key

class MyVeryCustomQuery(gcloud.rest.datastore.Query):
    value_kind = Value

class MyVeryCustomGQLQuery(gcloud.rest.datastore.GQLQuery):
    value_kind = Value

You can then drop in the MyVeryCustomDatastore class anywhere you previously used Datastore, and do the same for Query and GQLQuery.
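
For instance, a drop-in usage might look like the following minimal sketch (the project name and credentials path are placeholders, and the query reuses the kind from the earlier example):

# MyVeryCustomDatastore is used exactly like Datastore; query results are
# deserialized through the overridden kinds configured above
ds = MyVeryCustomDatastore('my-gcloud-project', '/path/to/creds.json')
results = await ds.runQuery(MyVeryCustomQuery(kind='the_meaning_of_life'))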

To override any sub-key, you'll also need to override any parent classes which use it. For example, if you want to use a custom Key kind and be able to run queries against it, you will need to implement your own Value, Query, and GQLQuery classes and wire them up to the rest of the custom classes:

class MyKey(gcloud.rest.datastore.Key):
    pass

class MyValue(gcloud.rest.datastore.Value):
    key_kind = MyKey

class MyEntity(gcloud.rest.datastore.Entity):
    key_kind = MyKey
    value_kind = MyValue

class MyEntityResult(gcloud.rest.datastore.EntityResult):
    entity_kind = MyEntity

class MyQueryResultBatch(gcloud.rest.datastore.QueryResultBatch):
    entity_result_kind = MyEntityResult

class MyDatastore(gcloud.rest.datastore.Datastore):
    key_kind = MyKey
    entity_result_kind = MyEntityResult
    query_result_batch_kind = MyQueryResultBatch
    value_kind = MyValue

class MyQuery(gcloud.rest.datastore.Query):
    value_kind = MyValue

class MyGQLQuery(gcloud.rest.datastore.GQLQuery):
    value_kind = MyValue

Contributing

Please see our contributing guide.
