
An extension of the ralf toolkit with convenient primitives for building LLM-based dialogue agents.


ralf-dialogue


ralf-dialogue is a Python framework for quickly prototyping dialogue agents that leverage large language models (LLMs), such as ChatGPT or GPT-4. It lives in the broader RALF ecosystem and assumes the use of the ralf library, which provides primitives, constructs, and other utilities for building complex systems around LLMs using a design paradigm that parts of the community have begun to call composability.

A word on the broader RALF ecosystem

ralf is a Python library intended to help developers create applications that involve calls to large language models (LLMs). A core concept in ralf is composability, which allows LLM calls to be chained together so that the output of one call can form the prompt of another. ralf makes it easy to chain together both LLM-based and Python-based actions, enabling developers to construct complex information-processing pipelines from simpler building blocks. Using LLMs in this way can lead to more capable, robust, steerable, and inspectable applications.
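To make the idea concrete, here is a minimal sketch of the composability pattern in plain Python. The names below (call_llm, chain) are purely illustrative and are not the actual ralf API; they only show how the output of one LLM call can feed the prompt of the next, interleaved with ordinary Python steps.

# Illustrative only -- not the real ralf API. The output of one LLM call
# becomes part of the prompt for the next, with a Python step in between.

def call_llm(prompt: str) -> str:
    """Stand-in for a call to an LLM such as ChatGPT or GPT-4."""
    raise NotImplementedError("Wire this up to your LLM client of choice.")

def chain(document: str) -> str:
    # Step 1: LLM-based action -- summarize the document.
    summary = call_llm(f"Summarize the following text:\n{document}")

    # Step 2: Python-based action -- simple post-processing of the summary.
    bullet_points = [line.strip() for line in summary.splitlines() if line.strip()]

    # Step 3: LLM-based action -- the previous output forms the new prompt.
    return call_llm(
        "Suggest a title for a document with these key points:\n"
        + "\n".join(f"- {point}" for point in bullet_points)
    )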

ralf is a general framework for building applications -- including conversational AIs -- that rely on a mix of prompted large language models (LLMs) and conventional Python code and services, which can be considered a powerful form of neuro-symbolic AI. Other packages -- such as this one, ralf-dialogue -- build on the core ralf library, adding primitives and components for building conversational agents (or "chatbots") that interact with users in natural language, leverage conversational context, and access external knowledge stores or reasoning engines. Many of these components are still under active development, and we're always looking for talented contributors.
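The following is a bare-bones sketch of what such an agent loop might look like: keeping conversational context and consulting an external knowledge store before prompting the LLM. The function names (call_llm, lookup_knowledge, dialogue_turn) are hypothetical stand-ins, not ralf-dialogue's actual interface.

# Illustrative only -- not the ralf-dialogue API. A minimal dialogue loop
# that keeps conversational context and consults an external knowledge store.

def call_llm(prompt: str) -> str:
    """Stand-in for a prompted LLM call."""
    raise NotImplementedError("Wire this up to your LLM client of choice.")

def lookup_knowledge(query: str) -> str:
    """Stand-in for a call to an external knowledge store or reasoning engine."""
    return ""  # e.g., retrieved passages, database rows, or reasoner output

def dialogue_turn(history: list[str], user_utterance: str) -> str:
    # Record the user's turn in the running conversational context.
    history.append(f"User: {user_utterance}")

    # Pull in any external knowledge relevant to the current utterance.
    facts = lookup_knowledge(user_utterance)

    # Build a prompt from the retrieved facts plus the conversation so far.
    prompt = (
        "You are a helpful assistant.\n"
        f"Relevant facts:\n{facts}\n"
        "Conversation so far:\n" + "\n".join(history) + "\nAssistant:"
    )
    reply = call_llm(prompt)
    history.append(f"Assistant: {reply}")
    return reply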

Getting started

Create a new Conda environment (or equivalent) containing Python 3.9 and switch to it:

conda create -n [ralf-env-name] python=3.9
conda activate [ralf-env-name]

Make sure you have the RALF library cloned and installed:

git clone git@github.com:jhuapl-fomo/ralf.git
cd ralf
pip install -r requirements.txt
flit build
pip install -e .

Next, clone this repo (the RALF Dialogue repo) into a local directory, navigate into it, and repeat the steps above:

git clone git@github.com:jhuapl-fomo/ralf-dialogue.git
cd ralf-dialogue
pip install -r requirements.txt
flit build
pip install -e .
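If both installs succeeded, a quick import check should pass. The module names ralf and ralf_dialogue are assumed here from the package names; adjust if your install differs:

python -c "import ralf; import ralf_dialogue; print('RALF and RALF Dialogue imported successfully')"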

