
Deep, recursive, goal-driven LLM explorer


deepllm: Full Automation of Goal-driven LLM Dialog Threads with And-Or Recursors and Refiner Oracles

GitHub repo: https://github.com/ptarau/recursors

Overview

We automate deep step-by-step reasoning in an LLM dialog thread by recursively exploring alternatives (OR-nodes) and expanding details (AND-nodes) up to a given depth. Starting from a single succinct task-specific initiator, we steer the automated dialog thread to stay focussed on the task by synthesizing a prompt that summarizes the depth-first steps taken so far.
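To make the control structure concrete, here is a minimal sketch of the idea (illustrative only, not the actual DeepLLM code): an OR-step asks the LLM for alternative ways of achieving a goal, an AND-step asks for the details a chosen alternative requires, and the trace of steps taken so far is folded into each new prompt to keep the dialog on topic. The helper ask_llm and the prompt wording are placeholders.

# Illustrative sketch of the AND-OR recursion; ask_llm is a placeholder, not part of DeepLLM.
def ask_llm(prompt):
    """Placeholder: send prompt to an LLM and parse the reply into a list of short items."""
    raise NotImplementedError

def explore(goal, trace, depth):
    """Depth-first AND-OR exploration, bounded by depth."""
    if depth == 0:
        yield trace + [goal]                # a completed trace of justification steps
        return
    context = '; '.join(trace + [goal])     # summary of the depth-first steps taken so far
    # OR-node: ask for alternative ways of achieving the goal
    for alternative in ask_llm(f"Given the steps [{context}], list alternatives for: {goal}"):
        # AND-node: ask for the details this alternative requires,
        # then (simplified here) explore each required detail recursively
        for detail in ask_llm(f"Given the steps [{context}], list what '{alternative}' requires"):
            yield from explore(detail, trace + [goal, alternative], depth - 1)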

Our algorithm is derived from a simple recursive-descent implementation of a Horn Clause interpreter, except that we adapt our logic engine to fit the natural-language reasoning patterns LLMs have been trained on. Semantic similarity to ground-truth facts, or oracle advice from another LLM instance, is used to restrict the search space and to validate the traces of justification steps returned as answers. At the end, the unique minimal model of a generated Horn Clause program collects the results of the reasoning process.
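The minimal-model step can be made concrete with the standard bottom-up fixpoint construction for propositional Horn clause programs; the sketch below is a textbook version of that construction, not DeepLLM's internal code. A program is represented as a list of (head, body) pairs, with facts having empty bodies.

def minimal_model(program):
    # Naive bottom-up evaluation: keep firing rules whose bodies are already derived
    # until no new atoms can be added; the result is the unique minimal model.
    model = set()
    changed = True
    while changed:
        changed = False
        for head, body in program:
            if head not in model and all(atom in model for atom in body):
                model.add(head)
                changed = True
    return model

# Example program collected from reasoning traces: one fact and two rules.
program = [('a', []), ('b', ['a']), ('c', ['a', 'b'])]
print(minimal_model(program))  # -> {'a', 'b', 'c'}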

As applications, we sketch implementations of consequence predictions, causal explanations, recommendation systems and topic-focussed exploration of scientific literature.

Installation and usage

First, you will need an OpenAI API key, which you can generate from your OpenAI account.

NEW: with it, you are ready to try it out at: https://deepllm.streamlit.app/

NEW: an intro on how to use the app and the API is now on YouTube

To run the code locally, put the OpenAI API key in your Linux or macOS shell environment with:

export OPENAI_API_KEY=<your_key>
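If exporting a shell variable is inconvenient (e.g., on Windows or in a notebook), the key can also be set from Python before any DeepLLM functions are called; this uses only the standard library, not a DeepLLM-specific API:

import os
# Set the key for the current process only; <your_key> stands for your actual key.
os.environ['OPENAI_API_KEY'] = '<your_key>'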

Downloading

Clone from GitHub with:

git clone git@github.com:ptarau/recursors.git

Installing

If you have cloned this repo, you can install the deepllm package by typing, in the recursors folder:

pip3 install -e .

You can also install it from PyPI with:

pip3 install deepllm

API

The DeepLLM API exposes high-level functions ready to embed in your application with something as simple as (assuming your OPENAI_API_KEY is exported in your environment):

from deepllm.api import *  # import path assumed; exposes run_recursor, run_rater and the prompters

for result in run_recursor(initiator='Using tactical nukes', prompter=conseq_prompter, lim=2):
    print(result)

Also, you can explore questions with less gruesome results, as in:

for result in run_rater(initiator='Artificial General Intelligence', prompter=sci_prompter, lim=2, threshold=0.5):
    print(result)
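In both calls, lim bounds the recursion depth; threshold appears to set the minimum semantic-similarity score a generated step must reach to be kept (see the Overview above), though its exact semantics are defined by the package.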

Tests and demos

Streamlit web app

After installing streamlit (e.g., with pip3 install streamlit), try it with:

streamlit run deepllm/apps/app.py
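Streamlit serves the app locally (by default at http://localhost:8501) and opens it in your browser; the hosted version at https://deepllm.streamlit.app/ is presumably the same app.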

Paper describing this work

If you find this software useful, please cite it as:

@ARTICLE{tarau2023automation,
       author = {{Tarau}, Paul},
        title = "{Full Automation of Goal-driven LLM Dialog Threads with And-Or Recursors and Refiner Oracles}",
      journal = {arXiv e-prints},
     keywords = {Computer Science - Artificial Intelligence, Computer Science - Logic in Computer Science},
         year = 2023,
        month = jun,
          eid = {arXiv:2306.14077},
        pages = {arXiv:2306.14077},
          doi = {10.48550/arXiv.2306.14077},
archivePrefix = {arXiv},
       eprint = {2306.14077},
 primaryClass = {cs.AI},
       adsurl = {https://ui.adsabs.harvard.edu/abs/2023arXiv230614077T},
      adsnote = {Provided by the SAO/NASA Astrophysics Data System}
}

You can also find the paper (and future related work) in the docs folder.

Enjoy,

Paul Tarau
