
Prolog-like interpreter and tuple store

Project description

A lightweight Prolog-like interpreter with a natural-language-style syntax and a neuro-symbolic tuple database interface

We closely follow Einstein's "Everything should be made as simple as possible, but no simpler."

At this point, we rely on Python's own error checking, without doing much to warn about syntactic or semantic errors. Such checks can be added later; for now, this is meant as an executable specification of an otherwise simple and natural logic language, which we hereby name Natlog.

Natlog : a succinct overview

  • Terms are represented as nested tuples.

  • A parser and scanner for a simplified Prolog term syntax are used to turn terms into nested Python tuples.
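
For instance (an illustration consistent with the parsed rules shown later on this page, where variables appear as small integers numbered in order of first occurrence), the rule

tc A Rel C : A Rel B, tc B Rel C.

would be represented as the nested Python tuple

(('tc', 0, 1, 2), ((0, 1, 3), ('tc', 3, 1, 2)))

with the head first, followed by the tuple of body goals.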

The surface syntax of facts, as read from strings, is just whitespace-separated words (with tuples parenthesized) and sentences ending with . or ?. As in Prolog, variables are capitalized, unless quoted. Example programs are in the natprogs folder, for instance tc.nat:

cat is feline.
tiger is feline.
mouse is rodent.
feline is mammal.
rodent is mammal.
snake is reptile.
mammal is animal.
reptile is animal.

tc A Rel B : A Rel B.
tc A Rel C : A Rel B, tc B Rel C.

To query it, try:

>>> n=natlog(file_name="natprogs/tc.nat")
>>> n.query("tc Who is animal ?")

It will return the transitive closure of the is relation.

GOAL PARSED: (('tc', 0, 'is', 'animal'),)
ANSWER: ('tc', 'cat', 'is', 'animal')
ANSWER: ('tc', 'tiger', 'is', 'animal')
ANSWER: ('tc', 'mouse', 'is', 'animal')
ANSWER: ('tc', 'feline', 'is', 'animal')
ANSWER: ('tc', 'rodent', 'is', 'animal')
ANSWER: ('tc', 'snake', 'is', 'animal')
ANSWER: ('tc', 'mammal', 'is', 'animal')
ANSWER: ('tc', 'reptile', 'is', 'animal')

List processing is also supported as in:

app () Ys Ys. 
app (X Xs) Ys (X Zs) : app Xs Ys Zs.
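
Assuming the two app clauses above are saved in a .nat file (say natprogs/app.nat, a hypothetical name) and that lists are written as nested head/tail pairs, as the clauses themselves suggest, a query could look like:

>>> n = natlog(file_name="natprogs/app.nat")   # hypothetical file holding the two app clauses
>>> n.query("app (1 (2 ())) (3 (4 ())) Rs ?")

with Rs expected to be bound to the concatenation (1 (2 (3 (4 ())))).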

The interpreter supports a yield mechanism, similar to Python's own: a goal like ^ my_answer X causes my_answer X to be yielded as an answer.
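
As a minimal sketch (a hypothetical clause over the tc.nat facts above):

mammals : X is mammal, ^ found X.

so that querying mammals ? would produce found feline and found rodent among its answers.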

The interpreter has also been extended to handle simple function and generator calls to Python using the same prefix operator syntax:

  • `f A B .. Z R, resulting in the Python function f(A,B,..,Z) being called and R unified with its result
  • ``f A B .. Z R, resulting in the Python generator f(A,B,..,Z) being called and R unified with its yields, one at a time
  • ~ R A B .. Z, for unifying ~ R A B .. Z with matching facts in the term store (an example follows this list)
  • # f A B .. Z, resulting in f(A,B,..,Z) being called with no result returned
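
For instance, the dbtc.nat program used in the database example below bridges its rules and the stored facts with the ~ operator; its is_a rule, also visible in parsed form in the RULES listing further down, reads:

X is_a Y : ~ X is Y.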

A nested tuple store for unification-based tuple mining

An indexer, in combination with the unification algorithm, is used to retrieve ground terms matching query terms that contain logic variables.

Indexing is on all constants occurring in the ground facts placed in the database.

As the facts are ground, the occurs check and trailing are turned off during unification when searching for a match.
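
The idea behind the indexer can be sketched roughly as follows (an illustrative sketch, not the actual Natlog implementation): each constant maps to the set of facts it occurs in, and the candidate facts for a query are the intersection of the sets for the query's constants, with unification doing the final filtering.

# Illustrative sketch of constant-based indexing (not the actual Natlog code).
from collections import defaultdict

def constants(term):
    # collect the non-variable atoms of a nested tuple term
    if isinstance(term, tuple):
        for t in term:
            yield from constants(t)
    elif isinstance(term, str):  # constants are strings; variables would be ints
        yield term

class TupleIndex:
    def __init__(self):
        self.facts = []
        self.index = defaultdict(set)  # constant -> ids of facts containing it

    def add(self, fact):
        i = len(self.facts)
        self.facts.append(fact)
        for c in constants(fact):
            self.index[c].add(i)

    def candidates(self, query):
        # facts containing every constant of the query; unification filters the rest
        sets = [self.index[c] for c in constants(query)]
        ids = set.intersection(*sets) if sets else set(range(len(self.facts)))
        return [self.facts[i] for i in sorted(ids)]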

To try it out, do:

python3 -i tests.py

>>> dtest()

After digesting a text and then querying it, this gives:

   John has (a car).
   Mary has (a bike).
   Mary is (a student).
   John is (a pilot).
   
('John', 'has', ('a', 'car'))
('Mary', 'has', ('a', 'bike'))
('Mary', 'is', ('a', 'student'))
('John', 'is', ('a', 'pilot'))


Who has (a What)?
--> ('John', 'has', ('a', 'car'))
--> ('Mary', 'has', ('a', 'bike'))

Who is (a pilot)?
--> ('John', 'is', ('a', 'pilot'))

'Mary' is What?
--> ('Mary', 'is', ('a', 'student'))

'John' is (a What)?
--> ('John', 'is', ('a', 'pilot'))

Who is What?
--> ('Mary', 'is', ('a', 'student'))
--> ('John', 'is', ('a', 'pilot'))

Neuro-symbolic tuple database (NOT ADDED YET TO THIS VERSION)

As an extension to the nested tuple store, the neuro-symbolic tuple database uses a machine learning algorithm instead of the indexer. Thus it offers the same interface as the tuple store that it extends. The learner is trained upon loading the database file (from a .nat, .csv or .tsv file) and its inference mechanism is triggered when facts from the database are queried. The stream of tuples returned by the query is then filtered via unification (and possibly by more general integrity constraints, expressed via logic programming constructs).

Example of usage (see more at https://github.com/ptarau/pypro/blob/master/tests.py):

def ndb_test() :
  nd = neural_natlog(file_name="natprogs/dbtc.nat",db_name="natprogs/db.nat")
  print('RULES')
  print(nd)
  print('DB FACTS')
  print(nd.db)
  nd.query("tc Who is_a animal ?")

The output shows the X and y numpy arrays used to fit the sklearn learner, followed by the logic program's rules and the database facts from which the arrays were extracted when the facts were loaded.

X:
 [[1 0 0 0 0 0 0 0 0 0 0 0]
 [0 1 0 0 0 0 0 0 0 0 0 0]
 [0 0 1 0 0 0 0 0 0 0 0 0]
 [0 0 0 1 0 0 0 0 0 0 0 0]
 [0 0 0 0 1 0 0 0 0 0 0 0]
 [0 0 0 0 0 1 0 0 0 0 0 0]
 [0 0 0 0 0 0 1 0 0 0 0 0]
 [0 0 0 0 0 0 0 1 0 0 0 0]
 [0 0 0 0 0 0 0 0 1 0 0 0]
 [0 0 0 0 0 0 0 0 0 1 0 0]
 [0 0 0 0 0 0 0 0 0 0 1 0]
 [0 0 0 0 0 0 0 0 0 0 0 1]]

y:
 [[1 0 1 0 0 0 0 0 0 0]
 [1 1 1 1 1 1 1 1 1 1]
 [1 0 0 0 0 0 0 0 0 0]
 [0 1 0 1 0 0 0 0 0 0]
 [0 1 0 0 0 0 0 0 0 0]
 [0 0 1 1 0 1 0 0 0 0]
 [0 0 0 0 1 0 0 0 0 0]
 [0 0 0 0 1 0 1 0 0 0]
 [0 0 0 0 0 1 1 0 0 1]
 [0 0 0 0 0 0 0 1 1 1]
 [0 0 0 0 0 0 0 1 0 0]
 [0 0 0 0 0 0 0 0 1 0]] 

RULES
(('cat', 'is_a', 'feline'), ())
((0, 'is_a', 1), (('~', 0, 'is', 1),))
(('tc', 0, 1, 2), ((0, 1, 3), ('tc1', 3, 1, 2)))
(('tc1', 0, 1, 0), ())
(('tc1', 0, 1, 2), (('tc', 0, 1, 2),))

DB FACTS
(0, ('tiger', 'is', 'feline'))
(1, ('mouse', 'is', 'rodent'))
(2, ('feline', 'is', 'mammal'))
(3, ('rodent', 'is', 'mammal'))
(4, ('snake', 'is', 'reptile'))
(5, ('mammal', 'is', 'animal'))
(6, ('reptile', 'is', 'animal'))
(7, ('bee', 'is', 'insect'))
(8, ('ant', 'is', 'insect'))
(9, ('insect', 'is', 'animal'))

GOAL PARSED: (('tc', 0, 'is_a', 'animal'),)
ANSWER: ('tc', 'cat', 'is_a', 'animal')
ANSWER: ('tc', 'tiger', 'is_a', 'animal')
ANSWER: ('tc', 'mouse', 'is_a', 'animal')
ANSWER: ('tc', 'feline', 'is_a', 'animal')
ANSWER: ('tc', 'rodent', 'is_a', 'animal')
ANSWER: ('tc', 'snake', 'is_a', 'animal')
ANSWER: ('tc', 'mammal', 'is_a', 'animal')
ANSWER: ('tc', 'reptile', 'is_a', 'animal')
ANSWER: ('tc', 'bee', 'is_a', 'animal')
ANSWER: ('tc', 'ant', 'is_a', 'animal')
ANSWER: ('tc', 'insect', 'is_a', 'animal')
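
One plausible reading of the arrays above (an interpretation, not documented internals of neural_natlog): each row of X one-hot encodes one of the 12 constants occurring in the database, and the corresponding row of y marks which of the 10 facts contain that constant, so a multi-label sklearn model fit on (X, y) can play the role of the symbolic indexer. A toy sketch of that idea:

# Toy sketch (not the actual neural_natlog code): a multi-label model learns
# constant -> "facts containing that constant"; candidates for a query are the
# intersection of the predictions for its constants.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

facts = [('tiger', 'is', 'feline'), ('feline', 'is', 'mammal'), ('mammal', 'is', 'animal')]
consts = sorted({c for f in facts for c in f})

X = np.eye(len(consts), dtype=int)                             # one-hot row per constant
y = np.array([[int(c in f) for f in facts] for c in consts])   # facts containing each constant

model = KNeighborsClassifier(n_neighbors=1).fit(X, y)          # stands in for the indexer

def candidates(query_constants):
    # predicted fact sets for each constant, intersected over the query's constants
    rows = model.predict(X[[consts.index(c) for c in query_constants]])
    return [facts[i] for i in np.flatnonzero(rows.prod(axis=0))]

print(candidates(['feline', 'is']))  # facts mentioning both 'feline' and 'is'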
