
A library to store data in a knowledge graph


Setup

Install knowledge-graph using these commands:

git clone https://github.com/hyroai/knowledge-graph.git
pip install -e ./knowledge-graph
pip install cloud-utils@https://github.com/hyroai/cloud-utils/tarball/master

Publish to PyPI

After merging your branch into the main branch, you need to publish a new version to PyPI. On the repo's main view, go to the "Code" tab at the top and click "Releases". Click "Draft a new release" and create your release with a semantic version that follows the rules at https://semver.org/. Publishing the release will create a new version on PyPI from the latest commit on main.

When to use querying.py vs querying_raw.py

Both modules provide querying abilities, but querying.py relies on a global store of kgs, keyed by their hash value. The code retrieves the kg instance from this map ad hoc. This was built so we can have a serializable representation of a node in the kg, Node, which holds only a graph_id and a node_id, so we do not have to serialize the entire kg every time we serialize an object containing such nodes (e.g. when serializing or hashing a NounPhrase).

The global store is an impure component: code that uses it can behave differently depending on what happened before (whether or not the kg was already loaded).

Consequently, in contexts where you have both an Element and a KnowledgeGraph, prefer querying_raw.py and pass the graph instance explicitly. This is mainly the case when creating or enriching graphs.

In contexts where all you have is Node, use querying.py.

If you implement new querying functions, implement them in querying_raw.py, then lift them to querying.py using storage.run_on_kg_and_node.
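The lifting pattern can be sketched like this. The signature of `run_on_kg_and_node` and the `neighbors_raw` example are assumptions for illustration, not the library's actual API:

```python
from collections import namedtuple
from typing import Callable, Dict

_GRAPH_STORE: Dict[str, Dict] = {}
Node = namedtuple("Node", ["graph_id", "node_id"])


def run_on_kg_and_node(raw_fn: Callable) -> Callable:
    # Hypothetical lift: turn f(graph, node_id) into f(node),
    # fetching the graph from the global store by its id.
    def lifted(node: Node):
        graph = _GRAPH_STORE[node.graph_id]
        return raw_fn(graph, node.node_id)

    return lifted


# querying_raw style: the graph is an explicit argument.
def neighbors_raw(graph: Dict, node_id: str) -> Dict:
    return graph.get(node_id, {})


# querying style: lifted to work on a serializable Node alone.
neighbors = run_on_kg_and_node(neighbors_raw)

_GRAPH_STORE["g1"] = {"alice": {"person/gender": "female"}}
print(neighbors(Node("g1", "alice")))
```

Writing the raw version first keeps the logic pure and testable; the lifted version is a thin wrapper that adds the store lookup.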

Do not use the internal modules inside the knowledge_graph directory

Instead, rely on the imports in knowledge_graph/__init__.py, and add what you need there if necessary (this is rarely required).

Relations naming conventions

Relations are the middle part of the triplet. Although they can safely be named anything, we use a naming convention that serves as a mnemonic device. For example, the relation person/gender implies that the left-hand side is an entity representing a person, while the right-hand side is a gender, so we can expect a triplet like (Alice, person/gender, female).
