


Dialog Flow Framework

The Dialog Flow Framework (DFF) allows you to write conversational services. A service is defined by a dialog graph that describes the behavior of the dialog service and contains the dialog script. DFF offers a specialized domain-specific language (DSL) for quickly writing dialog graphs. You can use it for services such as Amazon Alexa skills, chatbots for social networks, website call centers, and more.
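The dialog-graph idea can be sketched in plain Python. The sketch below is a conceptual illustration only, not DFF's internals: each node has a response, and transitions are (condition, target) pairs checked in order, mirroring the script shown later in this page.

```python
from typing import Tuple

# Conceptual sketch of a dialog graph: NOT DFF's actual implementation.
# Each node has a response and a list of (condition, next_node) transitions.
script = {
    "node_hi": {
        "response": "Hi!!!",
        "transitions": [
            (lambda text: text == "Hi", "node_hi"),
            (lambda text: True, "node_ok"),
        ],
    },
    "node_ok": {
        "response": "Okey",
        "transitions": [
            (lambda text: text == "Hi", "node_hi"),
            (lambda text: True, "node_ok"),
        ],
    },
}


def step(node: str, user_text: str) -> Tuple[str, str]:
    """Pick the first transition whose condition matches, then answer."""
    for condition, target in script[node]["transitions"]:
        if condition(user_text):
            return target, script[target]["response"]
    return node, script[node]["response"]  # no transition matched: stay put


node = "node_hi"
node, reply = step(node, "Hi")   # matches the exact-match condition
node, reply2 = step(node, "ok")  # falls through to the catch-all condition
```

DFF's DSL expresses the same structure declaratively, with `TRANSITIONS` and `RESPONSE` keys instead of hand-written dispatch.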

License: Apache 2.0. Supported Python versions: 3.8, 3.9, 3.10, 3.11.

Quick Start

Installation

DFF can be installed via pip:

pip install dff

The above command installs the minimum dependencies needed to start working with DFF. During installation, you can choose among optional extras that pull in additional dependencies:

pip install dff[core]  # minimal dependencies (by default)
pip install dff[json]  # dependencies for using JSON
pip install dff[pickle]  # dependencies for using Pickle
pip install dff[redis]  # dependencies for using Redis
pip install dff[mongodb]  # dependencies for using MongoDB
pip install dff[mysql]  # dependencies for using MySQL
pip install dff[postgresql]  # dependencies for using PostgreSQL
pip install dff[sqlite]  # dependencies for using SQLite
pip install dff[ydb]  # dependencies for using Yandex Database
pip install dff[telegram]  # dependencies for using Telegram
pip install dff[benchmark]  # dependencies for benchmarking
pip install dff[full]  # full dependencies including all options above
pip install dff[tests]  # dependencies for running tests
pip install dff[test_full]  # full dependencies for running all tests (all options above)
pip install dff[tutorials]  # dependencies for running tutorials (all options above)
pip install dff[devel]  # dependencies for development
pip install dff[doc]  # dependencies for documentation
pip install dff[devel_full]  # full dependencies for development (all options above)

For example, if you are going to use one of the database backends, install the corresponding extra. Multiple extras can be installed at once, e.g.

pip install "dff[postgresql,mysql]"

Basic example

from dff.script import GLOBAL, TRANSITIONS, RESPONSE, Context, Message
from dff.pipeline import Pipeline
import dff.script.conditions.std_conditions as cnd
from typing import Tuple

# create a dialog script
script = {
    GLOBAL: {
        TRANSITIONS: {
            ("flow", "node_hi"): cnd.exact_match(Message(text="Hi")),
            ("flow", "node_ok"): cnd.true(),
        }
    },
    "flow": {
        "node_hi": {RESPONSE: Message(text="Hi!!!")},
        "node_ok": {RESPONSE: Message(text="Okey")},
    },
}

# init pipeline
pipeline = Pipeline.from_script(script, start_label=("flow", "node_hi"))


# handle requests
def turn_handler(in_request: Message, pipeline: Pipeline) -> Tuple[Message, Context]:
    # Pass the user's next request into the pipeline; it returns an updated context with the actor's response
    ctx = pipeline(in_request, 0)
    # Get the last actor response from the context
    out_response = ctx.last_response
    return out_response, ctx


while True:
    in_request = input("type your answer: ")
    out_response, ctx = turn_handler(Message(text=in_request), pipeline)
    print(out_response.text)

When you run this code, you should see output similar to the following (note that exact_match is case-sensitive, so "hi" falls through to node_ok):

type your answer: hi
Okey
type your answer: Hi
Hi!!!
type your answer: ok
Okey
type your answer: ok
Okey

For more advanced examples, take a look at the tutorials on GitHub.

Context Storages

Description

Context Storages allow you to save and retrieve user dialogue states (in the form of a Context object) using various database backends.

The following backends are currently supported:

  • Redis
  • MongoDB
  • MySQL
  • PostgreSQL
  • SQLite
  • Yandex Database

Aside from these, we offer interfaces for saving data to your local file system (e.g. JSON and Pickle). They are not meant for production use, but can be helpful for prototyping your application.
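To illustrate what such a file-based store does, here is a toy sketch in plain Python. It is a conceptual stand-in, not DFF's actual JSON context storage; the class name and file path are invented for the example, and it is neither concurrent-safe nor efficient.

```python
import json
import os
import tempfile


class ToyJSONStorage:
    """Toy dict-like store that persists per-user contexts to a JSON file.

    For prototyping only: it reloads the whole file on every access.
    """

    def __init__(self, path: str):
        self.path = path

    def _load(self) -> dict:
        if not os.path.exists(self.path):
            return {}
        with open(self.path) as f:
            return json.load(f)

    def __getitem__(self, user_id: str) -> dict:
        return self._load()[user_id]

    def __setitem__(self, user_id: str, context: dict) -> None:
        data = self._load()
        data[user_id] = context
        with open(self.path, "w") as f:
            json.dump(data, f)


# Hypothetical file path for the example
store = ToyJSONStorage(os.path.join(tempfile.gettempdir(), "toy_contexts.json"))
store["user_1"] = {"last_response": "Hi!!!"}
```

A real context storage exposes the same mapping-style interface keyed by user id, which is why the pipeline can use any backend interchangeably.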

Basic example

from dff.script import Context
from dff.pipeline import Pipeline
from dff.context_storages import SQLContextStorage
from .script import some_df_script

db = SQLContextStorage("postgresql+asyncpg://user:password@host:port/dbname")

pipeline = Pipeline.from_script(
    some_df_script,
    start_label=("root", "start"),
    fallback_label=("root", "fallback"),
    context_storage=db,
)


def handle_request(request):
    user_id = request.args["user_id"]
    new_context = pipeline(request, user_id)
    return new_context.last_response

For more advanced examples, take a look at the tutorials on GitHub.

Contributing to the Dialog Flow Framework

Please refer to CONTRIBUTING.md.

Project details


Download files

Download the file for your platform.

Source Distribution

dff-0.6.2.tar.gz (108.6 kB)

Uploaded Source

Built Distribution


dff-0.6.2-py3-none-any.whl (178.9 kB)

Uploaded Python 3

File details

Details for the file dff-0.6.2.tar.gz.

File metadata

  • Download URL: dff-0.6.2.tar.gz
  • Upload date:
  • Size: 108.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for dff-0.6.2.tar.gz

  • SHA256: 1ed2baf7633a40d374f9954b0ab2296f7d94efc25e89abf047a5f2eecb18a32f
  • MD5: c5e457aaf2ee72b1779fe71f6ae5d238
  • BLAKE2b-256: df7b4c9d71b4f97c9eb611c0413f92a57148ad4b4fddc0b715a96b8dc006ace6
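You can check a downloaded archive against one of these digests with Python's standard hashlib module. The sketch below assumes you have downloaded the sdist to the current directory; the helper function name is our own.

```python
import hashlib


def sha256_of_file(path: str) -> str:
    """Stream the file in chunks and return its hex SHA256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


expected = "1ed2baf7633a40d374f9954b0ab2296f7d94efc25e89abf047a5f2eecb18a32f"
# Uncomment after downloading the archive:
# assert sha256_of_file("dff-0.6.2.tar.gz") == expected, "hash mismatch!"
```

Streaming in chunks keeps memory use constant regardless of archive size.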


File details

Details for the file dff-0.6.2-py3-none-any.whl.

File metadata

  • Download URL: dff-0.6.2-py3-none-any.whl
  • Upload date:
  • Size: 178.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for dff-0.6.2-py3-none-any.whl

  • SHA256: 142b5f6941e1ca81f33ce2418502bc8e3b4fd163277912b8bdeb5d732a355bee
  • MD5: b33d6287e961faf57b323853ab903055
  • BLAKE2b-256: aadfae97c36bcc860d26b5b05db27155adf8696f08c903a38d2a4520e7005e42

