
Query proxy that allows using AtlasSearch with MongoEngine-specific syntax


AtlasQ

AtlasQ allows you to use AtlasSearch while keeping the MongoEngine syntax.

Structure

The package tries to follow the MongoEngine structure; the major differences reside in the transform.py and queryset.py files.

Transform

As in MongoEngine, one step in the pipeline is the creation of a query from a Q object: we have to find a correspondence between the common MongoEngine syntax and what AtlasSearch allows. To do this, we had to accept some compromises.

Not every keyword is supported at the moment: if you have an actual use case that you would like to be supported, please feel free to open an issue or a PR at any time.
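
As a rough sketch of the correspondence, a simple equality filter on a text field maps onto an Atlas Search text clause roughly like the one below (illustrative only; the exact stage AtlasQ generates depends on the operators involved, and the index name "my_index" is just an example):

from atlasq import AtlasQ

# A MongoEngine-style query object...
q = AtlasQ(name="value")

# ...and, roughly, the kind of $search aggregation stage it corresponds to.
search_stage = {
    "$search": {
        "index": "my_index",
        "text": {"query": "value", "path": "name"},
    }
}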

QuerySet

There are probably a thousand better implementations for someone who really knows MongoEngine and, above all, PyMongo. Unfortunately, our knowledge is limited, so here we go. If you find a solution that works better, again, feel free to open an issue or a PR.

The main idea is that filter should work like an aggregation. To do so, while keeping compatibility with how MongoEngine works (i.e. filter should return a queryset of Document objects), we had to do some work.
Calling .aggregate instead works as MongoEngine expects, returning a list of dictionaries.
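
As a minimal sketch of the difference (using the MyDocument model defined in the Usage section below, and assuming the standard MongoEngine aggregate signature):

# filter() runs as an aggregation under the hood but still yields Document instances.
for doc in MyDocument.atlas.filter(name="value"):
    assert isinstance(doc, MyDocument)

# aggregate() behaves as MongoEngine expects and yields plain dictionaries.
for row in MyDocument.atlas.aggregate([{"$match": {"name": "value"}}]):
    assert isinstance(row, dict)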

Usage

Now the most important part: how do you use this package?

from mongoengine import Document, fields

from atlasq import AtlasManager, AtlasQ, AtlasQuerySet

index_name = "my_index"

class MyDocument(Document):
    name = fields.StringField(required=True)
    surname = fields.StringField(required=True)
    atlas = AtlasManager(index_name)

obj = MyDocument.objects.create(name="value", surname="value2")

qs = MyDocument.atlas.filter(name="value")
assert isinstance(qs, AtlasQuerySet)
obj_from_atlas = qs.first()
assert obj == obj_from_atlas

obj2_from_atlas = MyDocument.atlas.get(AtlasQ(name="value") & AtlasQ(surname="value2"))
assert obj == obj2_from_atlas


obj3_from_atlas = MyDocument.atlas.get(AtlasQ(wrong_field="value"))
assert obj3_from_atlas is None

Extended Features

Validation

We also decided to optionally support validation of the index. Two things are checked:

  • The index actually exists (if you query a non-existing index, Atlas by default will not raise any error).
  • The fields that you are querying are actually indexed (if you query a field that is not indexed, Atlas by default will not raise any error, and will return an empty result).

To perform these checks, you need to call the ensure_index function on the queryset.

from mongoengine import Document, fields

from atlasq import AtlasManager, AtlasQ

index_name = "my_index"

class MyDocument(Document):
    name = fields.StringField(required=True)
    surname = fields.StringField(required=True)
    atlas = AtlasManager(index_name)

result = MyDocument.atlas.ensure_index("user", "pwd", "group", "cluster")
assert result is True
obj1_from_atlas = MyDocument.atlas.get(AtlasQ(name="value")) 
obj2_from_atlas = MyDocument.atlas.get(AtlasQ(wrong_field="value")) # raises AtlasIndexFieldError

EmbeddedDocuments

Embedded documents are queried in two different ways, depending on how you created your Search Index. Remember to ensure the index so that AtlasQ knows how your index is defined. If you used the embeddedDocuments type, AtlasQ will use the embeddedDocument query syntax. Otherwise, if you used the document type, or you did not ensure the index, a normal text search with the dot (.) syntax will be used.
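
As an illustration (the mappings below and the address/city fields are hypothetical, not part of the examples above), the same MongoEngine-style filter, e.g. AtlasQ(address__city="Rome"), roughly corresponds to one of two Atlas Search clauses depending on the index definition:

# Hypothetical index mappings for an embedded "address" field.
embedded_documents_mapping = {"address": {"type": "embeddedDocuments", "dynamic": True}}
document_mapping = {"address": {"type": "document", "dynamic": True}}

# With the embeddedDocuments mapping (and an ensured index), the embeddedDocument
# operator is expected to be used:
embedded_clause = {
    "embeddedDocument": {
        "path": "address",
        "operator": {"text": {"query": "Rome", "path": "address.city"}},
    }
}

# With the document mapping, or when the index was not ensured, a plain text
# search with the dotted path is used instead:
text_clause = {"text": {"query": "Rome", "path": "address.city"}}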

Upload index

It is possible to upload the Search index directly with AtlasQ by calling the upload_index function on the queryset. Syntax checks are performed on the index itself. If _id is not present but pk or id was specified, it will be added automatically, allowing valid text queries on the primary key.

from mongoengine import Document, fields

from atlasq import AtlasManager

index_name = "my_index"
index = {
  "analyzer": "lucene.keyword",
  "mappings": {
    "dynamic": False,
    "fields": {
      "_id": {
        "type": "objectId"
      },
        "name": {
        "type": "string"
      },
      "surname": {
        "type": "string"
      },
    }
  }
}
class MyDocument(Document):
    name = fields.StringField(required=True)
    surname = fields.StringField(required=True)
    atlas = AtlasManager(index_name)

result = MyDocument.atlas.ensure_index("user", "pwd", "group", "cluster")
assert result is False
MyDocument.atlas.upload_index(index, "user", "pwd", "group", "cluster")
result = MyDocument.atlas.ensure_index("user", "pwd", "group", "cluster")
assert result is True
