

Project description

Knowledge Graph Exchange


KGX (Knowledge Graph Exchange) is a Python library and set of command line utilities for exchanging Knowledge Graphs (KGs) that conform to or are aligned to the Biolink Model.

The core data model is a Property Graph (PG), represented internally in Python as a networkx MultiDiGraph.

KGX allows conversion to and from a variety of serialization formats, including TSV, JSON, JSON Lines, OBOGraph JSON, and RDF, as well as Neo4j databases.
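
For example, here is a minimal conversion sketch from KGX TSV to KGX JSON, using the same Transformer API shown in the error-reporting example further down; the filenames are placeholders:

from kgx.transformer import Transformer

# Convert a KGX TSV node/edge file pair into a single KGX JSON file.
# "graph_nodes.tsv", "graph_edges.tsv", and "graph.json" are placeholder names.
transformer = Transformer()

transformer.transform(
    input_args={
        "filename": [
            "graph_nodes.tsv",
            "graph_edges.tsv",
        ],
        "format": "tsv",
    },
    output_args={
        "filename": "graph.json",
        "format": "json",
    },
)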

KGX also provides validation, to ensure KGs conform to the Biolink Model: nodes are categorized using Biolink classes, edges are labeled with valid Biolink predicates, and only valid properties are used.

The structure of this internal graph is expected to conform to the Biolink Model standard, as specified in the KGX format specification.
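
To illustrate the shape of that internal representation, here is a small hand-built sketch using plain networkx (not the KGX API itself), with Biolink-style node categories and edge properties; the identifiers and names are illustrative only:

import networkx as nx

# A toy property graph in the shape KGX uses internally:
# nodes carry Biolink categories, edges carry subject/predicate/object properties.
g = nx.MultiDiGraph()
g.add_node("HGNC:11603", name="example gene", category=["biolink:Gene"])
g.add_node("MONDO:0010011", name="example disease", category=["biolink:Disease"])
g.add_edge(
    "HGNC:11603",
    "MONDO:0010011",
    id="HGNC:11603-MONDO:0010011",
    subject="HGNC:11603",
    predicate="biolink:gene_associated_with_condition",
    object="MONDO:0010011",
)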

In addition to the main codebase, KGX also provides a suite of command line operations.

Error Detection and Reporting

Non-redundant, JSON-formatted structured error logging is provided by the KGX Transformer, Validator, GraphSummary, and MetaKnowledgeGraph operations. See the various unit tests for the general design pattern (the Validator is used as the example here):

from kgx.validator import Validator
from kgx.transformer import Transformer

Validator.set_biolink_model("2.11.0")

# Validator assumes the currently set Biolink Release
validator = Validator()

transformer = Transformer(stream=True)

transformer.transform(
    input_args={
        "filename": [
            "graph_nodes.tsv",
            "graph_edges.tsv",
        ],
        "format": "tsv",
    },
    output_args={
        "format": "null"
    },
    inspector=validator,
)

# Both the Validator and the Transformer can independently capture errors

# The Validator, from the overall semantics of the graph...
# Here, we just report severe Errors from the Validator (no Warnings)
validator.write_report(open("validation_errors.json", "w"), "Error")

# The Transformer, from the syntax of the input files... 
# Here, we catch *all* Errors and Warnings (by not providing a filter)
transformer.write_report(open("input_errors.json", "w"))

The JSON error outputs will look something like this:

{
    "ERROR": {
        "MISSING_EDGE_PROPERTY": {
            "Required edge property 'id' is missing": [
                "A:123->X:1",
                "B:456->Y:2"
            ],
            "Required edge property 'object' is missing": [
                "A:123->X:1"
            ],
            "Required edge property 'predicate' is missing": [
                "A:123->X:1"
            ],
            "Required edge property 'subject' is missing": [
                "A:123->X:1",
                "B:456->Y:2"
            ]
        }
    },
    "WARNING": {
        "DUPLICATE_NODE": {
          "Node 'id' duplicated in input data": [
            "MONDO:0010011",
            "REACT:R-HSA-5635838"
          ]
        }
    }
}

This format eliminates the significant redundancy of the earlier line-oriented KGX log files: graph entities sharing the same class of error are simply aggregated into lists of names/identifiers at the leaf level of the JSON structure.

The top-level JSON tags come from the MessageLevel class and the second-level tags from the ErrorType class in the error_detection module, while the third-level messages are hard-coded in log_error method calls in the code.
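
For example, a downstream script could walk that three-level structure to tally how many graph entities triggered each message (plain Python over the JSON report written in the example above):

import json

# Summarize the {message level: {error type: {message: [entity identifiers]}}} report.
with open("validation_errors.json") as report_file:
    report = json.load(report_file)

for level, error_types in report.items():
    for error_type, messages in error_types.items():
        for message, entities in messages.items():
            print(f"{level} | {error_type} | {message}: {len(entities)} entities")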

It is likely that additional error conditions within KGX can be efficiently captured and reported in the future using this general framework.

Installation

Installing from PyPI

KGX is available on PyPI and can be installed using pip as follows:

pip install kgx

To install a particular version of KGX, specify the version number:

pip install kgx==0.5.0

Installing from GitHub

Clone the GitHub repository and then install:

git clone https://github.com/biolink/kgx
cd kgx
poetry install

Setting up a testing environment for Neo4j

This release of KGX supports graph source and sink transactions with the 4.3 release of Neo4j.

KGX has a suite of tests that rely on Docker containers to run Neo4j-specific tests.

To set up the required containers, first install Docker on your local machine.

Once Docker is up and running, run the following commands:

docker run -d --rm --name kgx-neo4j-integration-test -p 7474:7474 -p 7687:7687 --env NEO4J_AUTH=neo4j/test neo4j:4.3
docker run -d --rm --name kgx-neo4j-unit-test -p 8484:7474 -p 8888:7687 --env NEO4J_AUTH=neo4j/test neo4j:4.3

Note: Setting up the Neo4j containers is optional. If no containers are set up, the tests that rely on them are skipped.
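
Once a container is running, a graph can be read from it with the Transformer. The sketch below is an assumption based on the KGX documentation for the neo4j input format (the uri, username, and password keys and the HTTP port may differ across KGX versions), shown here against the kgx-neo4j-integration-test container started above:

from kgx.transformer import Transformer

# Read the graph from the integration-test Neo4j container and write KGX TSV.
# The connection keys and port below are assumptions; adjust for your KGX version.
transformer = Transformer()

transformer.transform(
    input_args={
        "uri": "http://localhost:7474",
        "username": "neo4j",
        "password": "test",
        "format": "neo4j",
    },
    output_args={
        "filename": "neo4j_dump",
        "format": "tsv",
    },
)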

Project details


Download files

Download the file for your platform.

Source Distribution

kgx-2.0.4.tar.gz (94.7 kB)

Built Distribution

kgx-2.0.4-py3-none-any.whl (117.6 kB)

File details

Details for the file kgx-2.0.4.tar.gz.

File metadata

  • Download URL: kgx-2.0.4.tar.gz
  • Size: 94.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.16

File hashes

Hashes for kgx-2.0.4.tar.gz
  • SHA256: ff596400f1dcae9e3d4fb68302e8eb96faac4fff9606488481420f06fa1f335f
  • MD5: 8179f9fc35559df9388c22756a9fccd9
  • BLAKE2b-256: 60c3297f52917d58542fac88426101a6dc073d51eb4ff5bc91bf4f6f4f0958fc


File details

Details for the file kgx-2.0.4-py3-none-any.whl.

File metadata

  • Download URL: kgx-2.0.4-py3-none-any.whl
  • Size: 117.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.16

File hashes

Hashes for kgx-2.0.4-py3-none-any.whl
  • SHA256: 42a85f802266e70db2d4ebc35f87fd60f03c6172aaf2373f271ccaa47d3ee994
  • MD5: 8f1e12e6b9c3a3712b93e6359a61f00d
  • BLAKE2b-256: bf437a41304f9926164406c31b1621c643fa42815eb9f4b66813e229fb10f170

