# Hyperon DAS

Query Engine API for the Distributed AtomSpace (DAS). Hi! This package provides a query engine API for DAS, letting you execute queries using the Pattern Matcher.
## Installation
Before you start, make sure you have Python >= 3.8.5 and Pip installed on your system.
You can install and run this project using different methods. Choose the one that suits your needs.
### Using pip

Run the following command to install the project using pip:

```bash
pip install hyperon-das
```
### Using Poetry

If you prefer to manage your Python projects with Poetry, follow these steps:

1. Install Poetry (if you haven't already):

    ```bash
    pip install poetry
    ```

2. Clone the project repository:

    ```bash
    git clone git@github.com:singnet/das-query-engine.git
    cd das-query-engine
    ```

3. Install project dependencies using Poetry:

    ```bash
    poetry install
    ```

4. Activate the virtual environment created by Poetry:

    ```bash
    poetry shell
    ```

Now you can run the project within the Poetry virtual environment.
## Usage
So far there are two ways of making queries using the API: one that uses database persistence and one that doesn't. Creating and executing a query is exactly the same in both cases; the only difference is how you instantiate the API class. See the details below.
### Redis and MongoDB

If you want to use data persistence, you must have Redis and MongoDB running in your environment, and the following variables must be set to the appropriate values. Example:

```bash
DAS_MONGODB_HOSTNAME=172.17.0.2
DAS_MONGODB_PORT=27017
DAS_MONGODB_USERNAME=mongo
DAS_MONGODB_PASSWORD=mongo
DAS_REDIS_HOSTNAME=127.0.0.1
DAS_REDIS_PORT=6379
```

TIP: You can change the values in the `environment` file, which is in the root directory, and then run the command below:

```bash
source environment
```
Create a client API:

```python
from hyperon_das import DistributedAtomSpace

api = DistributedAtomSpace('redis_mongo')
```
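If you prefer to configure the connection from Python instead of sourcing a shell file, a minimal sketch is shown below. It assumes the library reads these settings from the process environment (as the `environment` file suggests); the `configure_das_env` helper is hypothetical and not part of hyperon_das:

```python
import os

# Hypothetical helper: export the DAS connection settings so the library
# can pick them up from the process environment.
DAS_SETTINGS = {
    "DAS_MONGODB_HOSTNAME": "172.17.0.2",
    "DAS_MONGODB_PORT": "27017",
    "DAS_MONGODB_USERNAME": "mongo",
    "DAS_MONGODB_PASSWORD": "mongo",
    "DAS_REDIS_HOSTNAME": "127.0.0.1",
    "DAS_REDIS_PORT": "6379",
}

def configure_das_env(settings=DAS_SETTINGS):
    """Set each DAS_* variable in os.environ (values must be strings)."""
    for key, value in settings.items():
        os.environ[key] = value

configure_das_env()
```

After the variables are set, you can instantiate `DistributedAtomSpace('redis_mongo')` as shown above.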
### In Memory

In this mode you don't need any external services; just instantiate the class as shown below:
- A simple query which is an `AND` operation on two links whose targets are variables:

```python
from hyperon_das import DistributedAtomSpace
from hyperon_das.pattern_matcher import And, Variable, Link
from hyperon_das.utils import QueryOutputFormat

api = DistributedAtomSpace('ram_only')

api.add_link({
    'type': 'Evaluation',
    'targets': [
        {'type': 'Predicate', 'name': 'Predicate:has_name'},
        {
            'type': 'Evaluation',
            'targets': [
                {'type': 'Predicate', 'name': 'Predicate:has_name'},
                {
                    'type': 'Set',
                    'targets': [
                        {'type': 'Reactome', 'name': 'Reactome:R-HSA-164843'},
                        {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'},
                    ]
                },
            ],
        },
    ],
})

expression = Link("Evaluation", ordered=True, targets=[Variable("V1"), Variable("V2")])

resp = api.pattern_matcher_query(
    expression,
    {'return_type': QueryOutputFormat.JSON, 'toplevel_only': True},
)
print(resp)
```

Output:

```json
[
    {
        "V1": {
            "type": "Predicate",
            "name": "Predicate:has_name",
            "is_link": false,
            "is_node": true
        },
        "V2": {
            "type": "Evaluation",
            "targets": [
                {"type": "Predicate", "name": "Predicate:has_name"},
                {
                    "type": "Set",
                    "targets": [
                        {"type": "Reactome", "name": "Reactome:R-HSA-164843"},
                        {"type": "Concept", "name": "Concept:2-LTR circle formation"}
                    ]
                }
            ],
            "is_link": true,
            "is_node": false
        }
    }
]
```
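Because `return_type: QueryOutputFormat.JSON` yields a JSON string, the result can be consumed with the standard library alone. A sketch, using an abbreviated copy of the output above:

```python
import json

# Abbreviated copy of the JSON string returned by pattern_matcher_query above.
resp = '''
[
    {
        "V1": {"type": "Predicate", "name": "Predicate:has_name",
               "is_link": false, "is_node": true},
        "V2": {"type": "Evaluation",
               "targets": [{"type": "Predicate", "name": "Predicate:has_name"}],
               "is_link": true, "is_node": false}
    }
]
'''

matches = json.loads(resp)

# Each element maps a variable name (V1, V2) to the atom it was bound to.
for match in matches:
    for var, atom in match.items():
        kind = "link" if atom["is_link"] else "node"
        print(f"{var} -> {kind} of type {atom['type']}")
```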
- Add nodes and a link (possible only in `ram_only` mode):

```python
api.count_atoms()  # (0, 0)

nodes = [
    {'type': 'Reactome', 'name': 'Reactome:R-HSA-164843'},
    {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'},
]

for node in nodes:
    api.add_node(node)

api.count_atoms()  # (2, 0)

link = {
    'type': 'Evaluation',
    'targets': [
        {'type': 'Predicate', 'name': 'Predicate:has_name'},
        {
            'type': 'Evaluation',
            'targets': [
                {'type': 'Predicate', 'name': 'Predicate:has_name'},
                {
                    'type': 'Set',
                    'targets': [
                        {'type': 'Reactome', 'name': 'Reactome:R-HSA-164843'},
                        {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'},
                    ]
                },
            ],
        },
    ],
}

api.add_link(link)
api.count_atoms()  # (3, 3)
```
Note 1: in this example 2 nodes and 1 link are added, yet in the end there are 3 nodes and 3 links. This is because links can be nested, and since links are composed of nodes, any node or nested link referenced by a link that doesn't yet exist in the system is added automatically.
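Note 1's arithmetic (2 nodes and 1 nested link in, 3 nodes and 3 links out) can be reproduced with a small stand-alone sketch; the `count_unique_atoms` helper below is hypothetical, not part of hyperon_das:

```python
def count_unique_atoms(atom, seen=None):
    """Count the distinct (nodes, links) reachable from a node/link dict.

    Nodes are identified by (type, name); links by their full nested shape.
    """
    if seen is None:
        seen = set()
    if 'targets' in atom:  # a dict with 'targets' is a link
        seen.add(('link', repr(atom)))
        for target in atom['targets']:
            count_unique_atoms(target, seen)
    else:  # otherwise it is a node
        seen.add(('node', atom['type'], atom['name']))
    nodes = sum(1 for entry in seen if entry[0] == 'node')
    links = sum(1 for entry in seen if entry[0] == 'link')
    return nodes, links

# The link from the example above: 3 distinct nodes inside 3 nested links
# (the outer Evaluation, the inner Evaluation, and the Set).
link = {
    'type': 'Evaluation',
    'targets': [
        {'type': 'Predicate', 'name': 'Predicate:has_name'},
        {
            'type': 'Evaluation',
            'targets': [
                {'type': 'Predicate', 'name': 'Predicate:has_name'},
                {'type': 'Set', 'targets': [
                    {'type': 'Reactome', 'name': 'Reactome:R-HSA-164843'},
                    {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'},
                ]},
            ],
        },
    ],
}

print(count_unique_atoms(link))  # (3, 3)
```

The repeated `Predicate:has_name` node is counted only once, which is why 3 nodes (not 4) come out.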
Note 2: for these methods to work, both nodes and links must be dicts with the structure shown above, i.e., a node needs at least the keys `type` and `name`, and a link needs at least `type` and `targets`.
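The minimal structures from Note 2 can be checked before calling `add_node` or `add_link` with a small stand-alone sketch; the `is_valid_node`/`is_valid_link` helpers are hypothetical, not part of hyperon_das:

```python
def is_valid_node(atom):
    """A node dict needs at least the keys 'type' and 'name'."""
    return isinstance(atom, dict) and 'type' in atom and 'name' in atom

def is_valid_link(atom):
    """A link dict needs at least 'type' and 'targets'; every target must
    itself be a valid node or a valid (nested) link."""
    if not (isinstance(atom, dict) and 'type' in atom and 'targets' in atom):
        return False
    return all(is_valid_node(t) or is_valid_link(t) for t in atom['targets'])

# The minimal shapes described in Note 2.
node = {'type': 'Concept', 'name': 'Concept:2-LTR circle formation'}
link = {'type': 'Evaluation', 'targets': [node]}
```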
## Tests

You can run the unit tests with the command below:

```bash
make test-coverage
```
## File details

Details for the file `hyperon_das-0.3.11.tar.gz`.

### File metadata

- Download URL: hyperon_das-0.3.11.tar.gz
- Upload date:
- Size: 21.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.0 Linux/6.2.0-1018-azure

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 2c0f6a210fcb664d284b0b7bca4c13c75fba93a8df34d0ce33b8131186409e92 |
| MD5 | ade31ada679f36d8a7b474346af8910e |
| BLAKE2b-256 | 96c4e1363df8d93a159410e488304a574ffef00e33685b2c5f2568f13c256ba3 |
## File details

Details for the file `hyperon_das-0.3.11-py3-none-any.whl`.

### File metadata

- Download URL: hyperon_das-0.3.11-py3-none-any.whl
- Upload date:
- Size: 22.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.12.0 Linux/6.2.0-1018-azure

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | c5f8ef9188d6256af9d22676f202fbb57a63046532e38e3ec82fb8bd175f07e9 |
| MD5 | 10ebdabee850c60c941866f5885e0299 |
| BLAKE2b-256 | aff182abc64c65891cb4fd5004b3ae3096946c576ee57f936193e880f41d4072 |