Query Engine API for Distributed AtomSpace
Project description
Hyperon DAS
Hi! This package is a query engine API for the Distributed AtomSpace (DAS). With it, you can execute queries using the Pattern Matcher.
Table of Contents
- Installation
- Usage
- Tests
Installation
Before you start, make sure you have Python >= 3.8.5 and Pip installed on your system.
You can install and run this project using different methods. Choose the one that suits your needs.
Using pip
Run the following command to install the project using pip:
pip install hyperon-das
Using Poetry
If you prefer to manage your Python projects with Poetry, follow these steps:
- Install Poetry (if you haven't already):

  pip install poetry

- Clone the project repository:

  git clone git@github.com:singnet/das-query-engine.git
  cd das-query-engine

- Install project dependencies using Poetry:

  poetry install

- Activate the virtual environment created by Poetry:

  poetry shell
Now you can run the project within the Poetry virtual environment.
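As an optional sanity check (not part of the original steps), you can confirm the package is importable from inside the Poetry shell; the import below is the same one used in the Usage section:

# Optional check: the same import used throughout the Usage examples below.
from hyperon_das import DistributedAtomSpace

print(DistributedAtomSpace)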
Usage
So far there are two ways of making queries using the API: one that uses database persistence and one that doesn't. Creating and executing a query is exactly the same in both cases; the only difference is how you instantiate the API class. See the details below.
Redis and MongoDB
If you want to use data persistence, you must have Redis and MongoDB running in your environment, with the following variables configured to their respective values:
Example:
DAS_MONGODB_HOSTNAME=172.17.0.2
DAS_MONGODB_PORT=27017
DAS_MONGODB_USERNAME=mongo
DAS_MONGODB_PASSWORD=mongo
DAS_REDIS_HOSTNAME=127.0.0.1
DAS_REDIS_PORT=6379
TIP: You can change the values in the environment file located in the repository root and then run the command below:
source environment
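If you prefer not to source the environment file, the same variables can be set from Python before creating the client. This is only a sketch, assuming the library reads its connection settings from the process environment as the variable names above suggest; the values are the example ones listed above:

import os

# Hypothetical alternative to `source environment`: set the same variables
# from Python before instantiating DistributedAtomSpace('redis_mongo').
os.environ['DAS_MONGODB_HOSTNAME'] = '172.17.0.2'
os.environ['DAS_MONGODB_PORT'] = '27017'
os.environ['DAS_MONGODB_USERNAME'] = 'mongo'
os.environ['DAS_MONGODB_PASSWORD'] = 'mongo'
os.environ['DAS_REDIS_HOSTNAME'] = '127.0.0.1'
os.environ['DAS_REDIS_PORT'] = '6379'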
Create a client API
from hyperon_das import DistributedAtomSpace
api = DistributedAtomSpace('redis_mongo')
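Once connected, this client exposes the same methods used in the In Memory examples below. As a minimal sketch (count_atoms is the same call shown later in this README), you can verify the connection by counting the atoms already stored in the backend:

# Reads the counts from the MongoDB/Redis backend instead of local RAM.
print(api.count_atoms())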
In Memory
In this mode you don't need any external services; just instantiate the class as shown below:
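from hyperon_das import DistributedAtomSpace

# 'ram_only' keeps every atom in local memory; no Redis or MongoDB needed.
api = DistributedAtomSpace('ram_only')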
- A simple query: an AND operation on two links whose targets are variables (a sketch that combines two link templates with And appears after the notes below).

  from hyperon_das import DistributedAtomSpace
  from hyperon_das.pattern_matcher import And, Variable, Link
  from hyperon_das.utils import QueryOutputFormat

  api = DistributedAtomSpace('ram_only')

  api.add_link({
      'type': 'Evaluation',
      'targets': [
          {'type': 'Predicate', 'name': 'Predicate:has_name'},
          {
              'type': 'Evaluation',
              'targets': [
                  {'type': 'Predicate', 'name': 'Predicate:has_name'},
                  {
                      'type': 'Set',
                      'targets': [
                          {'type': 'Reactome', 'name': 'Reactome:R-HSA-164843'},
                          {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'},
                      ]
                  },
              ],
          },
      ],
  })

  expression = Link("Evaluation", ordered=True, targets=[Variable("V1"), Variable("V2")])

  resp = api.pattern_matcher_query(
      expression,
      {'return_type': QueryOutputFormat.JSON, 'toplevel_only': True}
  )

  print(resp)
[ { "V1": { "type": "Predicate", "name": "Predicate:has_name", "is_link": false, "is_node": true }, "V2": { "type": "Evaluation", "targets": [ { "type": "Predicate", "name": "Predicate:has_name" }, { "type": "Set", "targets": [ { "type": "Reactome", "name": "Reactome:R-HSA-164843" }, { "type": "Concept", "name": "Concept:2-LTR circle formation" } ] } ], "is_link": true, "is_node": false } } ]
- Add Node and Link (only possible when using ram_only):

  api.count_atoms()  # (0, 0)

  nodes = [
      {'type': 'Reactome', 'name': 'Reactome:R-HSA-164843'},
      {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'},
  ]

  for node in nodes:
      api.add_node(node)

  api.count_atoms()  # (2, 0)

  link = {
      'type': 'Evaluation',
      'targets': [
          {'type': 'Predicate', 'name': 'Predicate:has_name'},
          {
              'type': 'Evaluation',
              'targets': [
                  {'type': 'Predicate', 'name': 'Predicate:has_name'},
                  {
                      'type': 'Set',
                      'targets': [
                          {'type': 'Reactome', 'name': 'Reactome:R-HSA-164843'},
                          {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'},
                      ]
                  },
              ],
          },
      ],
  }

  api.add_link(link)

  api.count_atoms()  # (3, 3)
Note 1: in this example we add 2 nodes and 1 link, but end up with 3 nodes and 3 links. Nested links are allowed, and since links are composed of nodes, any node referenced by a link that doesn't already exist in the system is added automatically.
Note 2: For these methods to work, both nodes and links must be dicts with the structure shown above, i.e., a node needs at least the parameters type and name, and a link needs type and targets.
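The first example above is described as an AND operation but, for brevity, its code matches a single link template. Below is a minimal sketch of a query that actually combines two link templates with And; it assumes And accepts a list of clauses, which is not shown explicitly elsewhere in this README:

from hyperon_das import DistributedAtomSpace
from hyperon_das.pattern_matcher import And, Variable, Link
from hyperon_das.utils import QueryOutputFormat

api = DistributedAtomSpace('ram_only')
# ... add atoms as in the examples above ...

# Two link templates sharing the variable V2: the conjunction (assumed to take
# a list of clauses) only matches assignments that satisfy both at once.
expression = And([
    Link("Evaluation", ordered=True, targets=[Variable("V1"), Variable("V2")]),
    Link("Evaluation", ordered=True, targets=[Variable("V2"), Variable("V3")]),
])

resp = api.pattern_matcher_query(
    expression,
    {'return_type': QueryOutputFormat.JSON, 'toplevel_only': True},
)
print(resp)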
Tests
You can run the command below to execute the unit tests:
make test-coverage
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file hyperon_das-0.3.3.tar.gz.
File metadata
- Download URL: hyperon_das-0.3.3.tar.gz
- Upload date:
- Size: 20.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.11.5 Linux/6.5.8-arch1-1
File hashes
Algorithm | Hash digest
--- | ---
SHA256 | 76bae8e8531735e3c4e4f122bc8ce85fbd2003390ca63d4540492d723c90d3a2
MD5 | eacaa638be0f425b6cf06018a8581e9e
BLAKE2b-256 | eac2b156bc8a1d94986ef2f2e764f801303807f59ca9f1cddb6a467d2a28f98d
File details
Details for the file hyperon_das-0.3.3-py3-none-any.whl.
File metadata
- Download URL: hyperon_das-0.3.3-py3-none-any.whl
- Upload date:
- Size: 22.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.11.5 Linux/6.5.8-arch1-1
File hashes
Algorithm | Hash digest
--- | ---
SHA256 | 6fb53b302f2b5981f701f546161ab139237d05d88c3bb0211e1eb8ca5c957f59
MD5 | a2552e1817256f43f0b3180f5d668550
BLAKE2b-256 | bf053a3fed0b09290bdb08a8d7e939ea7ea701711d04d5bb7fe711874593104d