
Query Engine API for Distributed AtomSpace

Project description

Hyperon DAS

Hi! This package is a query engine API for Distributed AtomSpace (DAS). It allows you to execute queries against a DAS using the Pattern Matcher.

Installation

Before you start, make sure you have Python >= 3.8.5 and pip installed on your system.

You can install and run this project using different methods. Choose the one that suits your needs.

Using pip

Run the following command to install the project using pip:

pip install hyperon-das

Using Poetry

If you prefer to manage your Python projects with Poetry, follow these steps:

  1. Install Poetry (if you haven't already):

    pip install poetry
    
  2. Clone the project repository:

    git clone git@github.com:singnet/das-query-engine.git
    cd das-query-engine
    
  3. Install project dependencies using Poetry:

    poetry install
    
  4. Activate the virtual environment created by Poetry:

    poetry shell
    

Now you can run the project within the Poetry virtual environment.

Pre-Commit Setup

Before pushing your changes, it's recommended to set up pre-commit to run automated tests locally. Run the following command (needs to be done once):

pre-commit install

Usage

You can instantiate DAS in three different ways. See below:

Local

This is a local instance of DAS with default settings.

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace()

Remote

To create a remote DAS, you need to set the 'query_engine' parameter to 'remote' and pass the remote machine's 'host' and 'port' parameters. See below how to do this:

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace(query_engine='remote', host='0.0.0.0', port=1234)

Server

To create a DAS server, you will need to specify the 'atomdb' parameter as 'redis_mongo' and pass the database parameters. The databases supported in this release are Redis and MongoDB. Therefore, the minimum expected parameters are:

  • mongo_hostname
  • mongo_port
  • mongo_username
  • mongo_password
  • redis_hostname
  • redis_port

Optionally, you can also pass other configuration parameters:

  • mongo_tls_ca_file
  • redis_username
  • redis_password
  • redis_cluster
  • redis_ssl

Example:

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace(
    atomdb='redis_mongo',
    mongo_hostname='127.0.0.2',
    mongo_port=27017,
    mongo_username='mongo',
    mongo_password='mongo',
    redis_hostname='127.0.0.1',
    redis_port=6379
)

TraverseEngine

Introducing TraverseEngine! This API functionality handles requests related to hypergraph traversal. In other words, it lets you walk from a given Atom to its neighborhood through adjacent Links.

Creating a TraverseEngine object

To create a TraverseEngine object, use the get_traversal_cursor method, which expects a handle as a starting point for the traversal.

Optionally, you can provide kwargs parameters:

  • handles_only: a bool that defaults to False. If True, the get methods in TraverseEngine return handles only.

Example:

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace()

traverse_engine = das.get_traversal_cursor(handle='12345', handles_only=True)

Traversal Methods

The TraverseEngine provides the following methods for graph traversal, illustrated in the sketch after this list:

  1. get(): Returns the current cursor.
  2. get_links(kwargs): Returns any links that have the current cursor as one of their targets, i.e. any links pointing to the cursor.
  3. get_neighbors(kwargs): Returns the set formed by all targets of all links that point to the current cursor. In other words, the set of “neighbors” of the current cursor.
  4. follow_link(kwargs): Updates the current cursor by following a link and randomly selecting one of its targets.
  5. goto(handle): Resets the current cursor to the passed handle.
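
Putting these together, here is a minimal sketch of a traversal session (the handle '12345' is hypothetical; replace it with the handle of an Atom that actually exists in your DAS):

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace()

# '12345' is a hypothetical handle; use the handle of a real Atom
cursor = das.get_traversal_cursor(handle='12345')

atom = cursor.get()                 # document of the atom under the cursor
links = cursor.get_links()          # links pointing to the current cursor
neighbors = cursor.get_neighbors()  # targets of the links pointing to the cursor
cursor.follow_link()                # move to a randomly selected neighbor
cursor.goto('12345')                # reset the cursor to the starting handle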

Parameters for Traversal Methods

Various parameters can be passed to the traversal methods to filter the results. For example:

  1. link_type=XXX: filters the response to contain only links whose named_type == XXX.
  2. cursor_position=N: filters the response so that only links with the current cursor at the Nth position of their targets are returned (only available in the get_links method).
  3. target_type=XXX: filters the response to contain only links where at least one of the targets has named_type == XXX.
  4. unique_path=FLAG: if FLAG is True, raises an exception if there is more than one possible neighbor to select after applying all filters (only available in the follow_link method).
  5. filter=F: F is a function used to filter results after all other filters have been applied. F should expect a dict (the atom document) and return True if and only if the atom should be kept (only available when the TraverseEngine object is created with handles_only=False). See the sketch after this list.
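
As an illustration of filter=F, here is a minimal sketch. It assumes a cursor created with handles_only=False and relies on the 'type' and 'name' fields of the atom documents shown in the examples below:

# Keep only Concept atoms whose name starts with 'm' (e.g. 'mammal', 'monkey').
# The function receives the atom document (a dict) and returns True to keep it.
def name_starts_with_m(atom: dict) -> bool:
    return atom['type'] == 'Concept' and atom['name'].startswith('m')

neighbors = cursor.get_neighbors(link_type='Inheritance', filter=name_starts_with_m)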

Examples

Local DAS

In this mode, queries run only against your local memory, using the Atoms you have added to the DAS. See below:

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace()

das.count_atoms() # (0, 0): no nodes, no links yet

das.add_link({
    'type': 'Inheritance',
    'targets': [
        {'type': 'Concept', 'name': 'human'},
        {'type': 'Concept', 'name': 'mammal'}
    ],
})

das.add_link({
    'type': 'Inheritance',
    'targets': [
        {'type': 'Concept', 'name': 'monkey'},
        {'type': 'Concept', 'name': 'mammal'}
    ]
})

das.count_atoms() # (3, 2): 3 nodes, 2 links

query = {
    'atom_type': 'link',
    'type': 'Inheritance',
    'targets': [
        {'atom_type': 'variable', 'name': 'v1'},
        {'atom_type': 'node', 'type': 'Concept', 'name': 'mammal'},
    ]
}

query_params = {
    "toplevel_only": False
}

resp = das.query(query, query_params)

print(resp)
[
    {
        "handle": "c93e1e758c53912638438e2a7d7f7b7f",
        "type": "Inheritance",
        "template": ["Inheritance", "Concept", "Concept"],
        "targets": [
            {
                "handle": "af12f10f9ae2002a1607ba0b47ba8407",
                "type": "Concept",
                "name": "human",
            },
            {
                "handle": "bdfe4e7a431f73386f37c6448afe5840",
                "type": "Concept",
                "name": "mammal",
            },
        ],
    },
    {
        "handle": "f31dfe97db782e8cec26de18dddf8965",
        "type": "Inheritance",
        "template": ["Inheritance", "Concept", "Concept"],
        "targets": [
            {
                "handle": "1cdffc6b0b89ff41d68bec237481d1e1",
                "type": "Concept",
                "name": "monkey",
            },
            {
                "handle": "bdfe4e7a431f73386f37c6448afe5840",
                "type": "Concept",
                "name": "mammal",
            },
        ],
    },
]

Remote DAS

In this mode, queries can run against both your local memory and the remote machine specified when the DAS instance was created. See below:

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace(query_engine='remote', host='192.32.11.45', port=9000)

The query method accepts a query_scope parameter with four available values, specifying whether the query should run locally, remotely, both, or remotely after synchronizing local changes. If you don't pass it, the default is "remote_only":

  1. "local_only"
  • Only local query
  1. "remote_only"
  • Only remote query
  1. "local_and_remote"
  • This type is not available yet. So, this will raise an exception
  1. "synchronous_update"
  • Before make query it will commit your local changes and then make the remote query

Local Scope

query = {
    'atom_type': 'link',
    'type': 'Inheritance',
    'targets': [
        {'atom_type': 'node', 'type': 'Concept', 'name': 'humana'},
        {'atom_type': 'node', 'type': 'Concept', 'name': 'mammala'},
    ]
}

query_params = {
    "toplevel_only": False,
    "query_scope": "local_only"
}

resp = das.query(query, query_params)
print(resp)
[]
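
The result is empty because no Atoms named 'humana' or 'mammala' were ever added to local memory.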

Remote Scope

query = {
    'atom_type': 'link',
    'type': 'Inheritance',
    'targets': [
        {'atom_type': 'node', 'type': 'Concept', 'name': 'human'},
        {'atom_type': 'node', 'type': 'Concept', 'name': 'mammal'},
    ]
}

query_params = {
    "toplevel_only": False
}

resp = das.query(query, query_params)

print(resp)
[
    {
        "handle": "c93e1e758c53912638438e2a7d7f7b7f",
        "type": "Inheritance",
        "template": ["Inheritance", "Concept", "Concept"],
        "targets": [
            {
                "handle": "af12f10f9ae2002a1607ba0b47ba8407",
                "type": "Concept",
                "name": "human",
            },
            {
                "handle": "bdfe4e7a431f73386f37c6448afe5840",
                "type": "Concept",
                "name": "mammal",
            },
        ],
    }
]

Remote Scope Synchronized with Local Atoms

# Add a local Link
das.add_link({
    'type': 'Inheritance',
    'targets': [
        {'type': 'Concept', 'name': 'monkey'},
        {'type': 'Concept', 'name': 'mammal'}
    ]
})

query = {
    'atom_type': 'link',
    'type': 'Inheritance',
    'targets': [
        {'atom_type': 'node', 'type': 'Concept', 'name': 'human'},
        {'atom_type': 'node', 'type': 'Concept', 'name': 'mammal'},
    ]
}

query_params = {
    "toplevel_only": False,
    "query_scope": "synchronous_update"
}

resp = das.query(query, query_params)

print(resp)
[
    {
        "handle": "c93e1e758c53912638438e2a7d7f7b7f",
        "type": "Inheritance",
        "template": ["Inheritance", "Concept", "Concept"],
        "targets": [
            {
                "handle": "af12f10f9ae2002a1607ba0b47ba8407",
                "type": "Concept",
                "name": "human",
            },
            {
                "handle": "bdfe4e7a431f73386f37c6448afe5840",
                "type": "Concept",
                "name": "mammal",
            },
        ],
    },
    {
        "handle": "f31dfe97db782e8cec26de18dddf8965",
        "type": "Inheritance",
        "template": ["Inheritance", "Concept", "Concept"],
        "targets": [
            {
                "handle": "1cdffc6b0b89ff41d68bec237481d1e1",
                "type": "Concept",
                "name": "monkey",
            },
            {
                "handle": "bdfe4e7a431f73386f37c6448afe5840",
                "type": "Concept",
                "name": "mammal",
            },
        ],
    },
]

Traverse

from hyperon_das import DistributedAtomSpace

das = DistributedAtomSpace(query_engine='remote', host='192.32.11.45', port=9000)

traverse = das.get_traversal_cursor(handle='12345', handles_only=True)


current_cursor = traverse.get()

links = traverse.get_links(link_type='Similarity', cursor_position=0, target_type='Concept')

neighbors = traverse.get_neighbors(link_type='Inheritance', target_type='Concept')

traverse.follow_link(link_type='Similarity', unique_path=True)

traverse.goto(handle='9990')

Tests

You can run the command below to run the unit tests:

make test-unit

Release Notes

DAS Query Engine Releases


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hyperon_das-0.7.1.tar.gz (29.9 kB)


Built Distribution

hyperon_das-0.7.1-py3-none-any.whl (31.0 kB)


File details

Details for the file hyperon_das-0.7.1.tar.gz.

File metadata

  • Download URL: hyperon_das-0.7.1.tar.gz
  • Size: 29.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Linux/6.5.0-1016-azure

File hashes

Hashes for hyperon_das-0.7.1.tar.gz:

  • SHA256: 4b1e110d567f42e0d4651e2a2067b599f3e9c44d94bba00e4b7df53193ac1ae9
  • MD5: ffcdee9601367bba297ab30b5e148df1
  • BLAKE2b-256: 5b03bb5df5eb22ea6b72990076c4ded13959d2b82499adc96691bf054b73df6f


File details

Details for the file hyperon_das-0.7.1-py3-none-any.whl.

File metadata

  • Download URL: hyperon_das-0.7.1-py3-none-any.whl
  • Size: 31.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.2 CPython/3.12.2 Linux/6.5.0-1016-azure

File hashes

Hashes for hyperon_das-0.7.1-py3-none-any.whl:

  • SHA256: 1d6263dfc29631b3187bd64a29e6f40571252918065b0517c27302e1a064ffd8
  • MD5: 054040114e481802945bf52a8e8709b5
  • BLAKE2b-256: 1625c613ec889c897237fee3962316443261e2fdef5bd9e2acc497bc83bc1afc

