PyNeuraLogic lets you use Python to create Differentiable Logic Programs.

Project description

PyNeuraLogic


Documentation · Examples · Papers · Report Bug · Request Feature

PyNeuraLogic lets you use Python to write Differentiable Logic Programs


About

Logic programming is a declarative coding paradigm in which you declare your logical variables and relations between them. These can be further composed into so-called rules that drive the computation. Such a rule set then forms a logic program, and its execution is equivalent to performing logic inference with the rules.

PyNeuraLogic, through its NeuraLogic backend, then makes this inference process differentiable, which, in turn, makes it equivalent to forward propagation in deep learning. This lets you learn numeric parameters that can be associated with the rules, just like you learn weights in neural networks.
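For illustration, here is a minimal sketch of such a rule set written in PyNeuraLogic's syntax (the Template, R and V names follow the library's core module, as used in the examples below; treat the exact import path as an assumption for your installed version):

from neuralogic.core import Template, R, V

template = Template()

# Two declarative rules: X is an ancestor of Y if X is a parent of Y,
# or if X is a parent of some Z who is in turn an ancestor of Y.
template += R.ancestor(V.X, V.Y) <= R.parent(V.X, V.Y)
template += R.ancestor(V.X, V.Y) <= (R.parent(V.X, V.Z), R.ancestor(V.Z, V.Y))

# Attaching weight dimensions (e.g. R.ancestor(V.X, V.Y)[1, 8] <= ...) makes the
# rules parameterized, and hence learnable, in exactly the sense described above.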

[Image: SQL tutorial]

What is this good for?

Many things! For instance - ever heard of Graph Neural Networks (GNNs)? Well, a graph happens to be a special case of a logical relation - a binary one to be more exact. Now, at the heart of any GNN model there is a so-called propagation rule for passing 'messages' between the neighboring nodes. Particularly, the representation ('message') of a node X is calculated by aggregating the previous representations of adjacent nodes Y, i.e. those with an edge between X and Y.

Or, a bit more 'formally':

R.msg2(V.X) <= (R.msg1(V.Y), R.edge(V.Y, V.X))

...and that's the actual code! Now for a classic learnable GNN layer, you'll want to add some weights, such as

R.msg2(V.X)[5,10] <= (R.msg1(V.Y)[10,20], R.edge(V.Y, V.X))

to project your [20,1] input node embeddings ('msg1') through a learnable [10,20] layer before the aggregation, and subsequently through a [5,10] layer after the aggregation.

If you don't like the default settings, you can of course specify various additional details, such as the particular aggregation and activation functions:

(R.msg2(V.X)[5,10] <= (R.msg1(V.Y)[10,20], R.edge(V.Y, V.X))) | [Transformation.RELU, Aggregation.AVG]

to instantiate the classic GCN layer specification, which you can now train directly!

graph TD;
    edge10[/"edge(1, 0)"\]-->RuleNeuron1("msg2(0) <= msg1(1), edge(1, 0).");
    msg1[/"msg1(1)"\]-- w_1 -->RuleNeuron1;

    edge00[/"edge(0, 0)"\]-->RuleNeuron2("msg2(0) <= msg1(0), edge(0, 0).");
    msg0[/"msg1(0)"\]-- w_1 -->RuleNeuron2;

    edge30[/"edge(3, 0)"\]-->RuleNeuron3("msg2(0) <= msg1(3), edge(3, 0).");
    msg3[/"msg1(3)"\]-- w_1 -->RuleNeuron3;

    RuleNeuron1-- ReLU -->AggregationNeuron[["Rules Aggregation (Average)"]]
    RuleNeuron2-- ReLU -->AggregationNeuron[["Rules Aggregation (Average)"]]
    RuleNeuron3-- ReLU -->AggregationNeuron[["Rules Aggregation (Average)"]]

    AggregationNeuron-- w_2 -->OutputNeuron[\"Output Neuron (Tanh)"/]
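
The diagram above shows how the backend grounds this rule into a computation graph for node 0. A hedged sketch of the corresponding template code follows (the Template, Settings and build names follow the library's documented core API; exact signatures may differ between versions, and the fact/feature names are illustrative):

from neuralogic.core import Template, Settings, R, V, Transformation, Aggregation

template = Template()

# The GCN-like layer from above: learnable projections, ReLU, average aggregation.
template += (R.msg2(V.X)[5, 10] <= (R.msg1(V.Y)[10, 20], R.edge(V.Y, V.X))) | [
    Transformation.RELU,
    Aggregation.AVG,
]

# Building the template grounds the rules into a differentiable model; the graph
# structure (R.edge facts) and node features (R.msg1 values) are then supplied
# as training examples through one of the library's dataset classes.
model = template.build(Settings())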

How is it different from other GNN frameworks?

Naturally, PyNeuraLogic is by no means limited to GNN models, as the expressiveness of relational logic extends far beyond graphs. Hence, nothing stops you from playing directly with any of the following (a small sketch follows the list):

  • multiple relations and object types
  • hypergraphs, nested graphs, relational databases
  • relational pattern matching, various subgraph GNNs
  • alternative propagation schemes
  • inclusion of logical background knowledge
  • and more...
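
For instance, a hedged sketch of what some of these look like in practice (the relation and feature names here are purely illustrative):

from neuralogic.core import Template, R, V

template = Template()

# Multiple relations / object types: separate learnable weights per edge type.
template += R.h1(V.X)[8, 16] <= (R.h0(V.Y)[16, 16], R.cites(V.Y, V.X))
template += R.h1(V.X)[8, 16] <= (R.h0(V.Y)[16, 16], R.writes(V.Y, V.X))

# Relational pattern matching: a rule that fires only on nodes closing a triangle.
template += R.in_triangle(V.X) <= (R.edge(V.X, V.Y), R.edge(V.Y, V.Z), R.edge(V.Z, V.X))

# Logical background knowledge can be added as plain (optionally weighted) rules, too.
template += R.reachable(V.X, V.Y) <= R.edge(V.X, V.Y)
template += R.reachable(V.X, V.Y) <= (R.edge(V.X, V.Z), R.reachable(V.Z, V.Y))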

In PyNeuraLogic, all these ideas take the same form of small, simple logic programs. These are typically highly transparent and easy to understand, thanks to their declarative nature. Consequently, there is no need to design a zoo of black-box class names for each small modification of the GNN rule - you code directly at the level of the logical principles here!

The backend engine then creates the underlying differentiable computation (inference) graphs in a fully automated and dynamic fashion, so you don't have to worry about aligning everything into static (tensor) operations.

How does it perform?

While PyNeuraLogic allows you to easily declare highly expressive models with capabilities far beyond the common GNNs, it does not come at the cost of performance for the basic GNNs either. On the contrary, for a range of common GNN models and applications, such as learning with molecules, PyNeuraLogic is actually considerably faster than the popular GNN frameworks, as demonstrated in our benchmarks.

[Image: Benchmark of PyNeuraLogic]


We hope you'll find the framework useful in designing your own deep relational learning ideas beyond GNNs! Please let us know if you need guidance or would like to collaborate!

🚀 Getting started

Installation

To install PyNeuraLogic's latest release from the PyPI repository, use the following command:

$ pip install neuralogic

Prerequisites

To use PyNeuraLogic, you need to install the following prerequisites:

  • Python >= 3.8
  • Java >= 1.8

Tip: If you want to use the visualization provided in the library, you also need to have Graphviz installed.


📦 Predefined Modules

PyNeuraLogic comes with a set of predefined modules to get you started with your experiments quickly! For example, there are predefined modules for (a small sketch follows the list):

  • Graph Neural Networks (GCNConv, SAGEConv, GINConv, RGCNConv, ...)
  • Meta graphs and meta paths (MetaConv, MAGNN, ...)
  • Transformer, LSTM, GRU, RNN, ...and more!
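
For example, a hedged sketch of plugging one of the predefined modules into a template (the GCNConv import path and its positional parameters, roughly the in/out channel sizes plus output, feature and edge predicate names, follow the module documentation and may differ slightly between versions):

from neuralogic.core import Template
from neuralogic.nn.module import GCNConv

template = Template()

# One GCN convolution: 16-dimensional "h0" node features aggregated over
# "edge" relations into 8-dimensional "h1" representations.
template += GCNConv(16, 8, "h1", "h0", "edge")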

🔬 Examples

All examples are available as Colab notebooks:

  • Simple XOR example
  • Molecular GNNs
  • Recursive XOR generalization
  • Visualization
  • Subgraph Patterns
  • Distinguishing k-regular graphs
  • Distinguishing non-regular graphs

📝 Papers

📘 Articles

🎥 Videos

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

neuralogic-0.8.0.tar.gz (2.0 MB)

Uploaded Source

Built Distribution

neuralogic-0.8.0-py3-none-any.whl (2.0 MB)

Uploaded Python 3

File details

Details for the file neuralogic-0.8.0.tar.gz.

File metadata

  • Download URL: neuralogic-0.8.0.tar.gz
  • Upload date:
  • Size: 2.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for neuralogic-0.8.0.tar.gz
Algorithm Hash digest
SHA256 09973b0a90aec4e5a277884979684961f625c99e4927213cb60ef5c8a980ed78
MD5 2bee8957ef1bc5a65bc345a405f3045a
BLAKE2b-256 60e0d7a9b6ac563e337dd69a9861fae90dd64f2d3499fb50d0c0fadc31325713

See more details on using hashes here.
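
If you want to check a downloaded archive against the digest above, a minimal sketch in Python (the local file name is assumed to match the download):

import hashlib

# Compute the SHA256 digest of the downloaded source distribution and
# compare it to the value published above.
with open("neuralogic-0.8.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "09973b0a90aec4e5a277884979684961f625c99e4927213cb60ef5c8a980ed78"
print("OK" if digest == expected else "MISMATCH")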

File details

Details for the file neuralogic-0.8.0-py3-none-any.whl.

File metadata

  • Download URL: neuralogic-0.8.0-py3-none-any.whl
  • Upload date:
  • Size: 2.0 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for neuralogic-0.8.0-py3-none-any.whl
Algorithm Hash digest
SHA256 f6178b24d2a83181ebcaddfa10d827ded5d6e3153664aca8741fcda176bfc1c8
MD5 4489dce8b4535165e3d563bc99c74a60
BLAKE2b-256 5931362aa488303bf1d7671a02c230f451e26b7cf16ec943c179be0742d098e4

See more details on using hashes here.
