
This project calculates the satisfaction score for BLiP chatbots.

Project description

TakeSatisfaction

Gabriel Salgado, Juliana Guamá, Milo Utsch and Rogers Damas

Overview

This README covers the project introduction, configuration, installation, testing, packaging, upload, notebooks, and maintenance tips.

Intro

This project offers a rate that represents the result of the customer satisfaction survey run by the bot.

The proposal converts the customer satisfaction rate (CSR) to a normalized rate between 0 and 1. The normalized value allows comparison of CSRs from different bots that use different scales.
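As a sketch of the idea, a rating on any numeric scale can be mapped to the 0–1 range with min-max normalization; the scale bounds below are illustrative assumptions, not the project's actual configuration:

```python
def normalize(answer: float, scale_min: float, scale_max: float) -> float:
    """Min-max normalize a rating to the 0..1 range."""
    return (answer - scale_min) / (scale_max - scale_min)

# A "4 out of 5" and an "8 out of 10" become directly comparable:
print(normalize(4, 1, 5))    # 0.75
print(normalize(8, 1, 10))   # 0.7777777777777778
```

With both ratings on the same 0–1 scale, CSRs from bots with different survey scales can be compared directly.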

So far there are two pipelines: the Bot Event Flow pipeline and CSR normalization.

Bot Event Flow Pipeline

The first step of the bot event flow pipeline is to collect bot event data from the Spark database. Bot data comes from the eventtracks table, which has no defined pattern; this project recovers data only from identified patterns.

Therefore, the bot event flow pipeline does not cover all the possible ways CSR data may be saved.
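The pattern-based recovery can be sketched as below. The event rows and the pattern are hypothetical, since eventtracks has no defined schema for CSR answers; only rows matching an identified pattern are kept:

```python
import re

# Hypothetical eventtracks rows: (category, action) pairs.
events = [
    ("satisfaction-survey", "rating-4"),
    ("menu", "option-2"),
    ("satisfaction-survey", "rating-1"),
    ("unknown", "free text"),
]

# Identified patterns that the pipeline knows how to recover.
KNOWN_PATTERNS = [re.compile(r"^rating-(\d+)$")]

def recover_ratings(rows):
    """Keep only events whose action matches an identified pattern."""
    ratings = []
    for category, action in rows:
        for pattern in KNOWN_PATTERNS:
            match = pattern.match(action)
            if match:
                ratings.append(int(match.group(1)))
    return ratings

print(recover_ratings(events))  # [4, 1]
```

Events saved in any other shape (such as the free-text row above) are simply skipped, which is why the pipeline cannot cover every way of saving CSR data.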

CSR normalization

We receive the CSR scale and then:

  1. apply light text preprocessing, such as str.lower
  2. compare the given scale with the known scales using fuzzywuzzy.ratio
  3. if the scale is known, convert and normalize it; otherwise, report that the scale does not exist
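The steps above can be sketched as follows. The known-scale table and the similarity threshold are hypothetical, and the standard library's difflib.SequenceMatcher stands in for fuzzywuzzy.ratio (both produce a similarity score; fuzzywuzzy's is on a 0–100 scale):

```python
from difflib import SequenceMatcher

# Hypothetical known scales mapped to their normalized values (0 to 1).
KNOWN_SCALES = {
    "yes no": {"yes": 1.0, "no": 0.0},
    "1 2 3 4 5": {"1": 0.0, "2": 0.25, "3": 0.5, "4": 0.75, "5": 1.0},
}

def similarity(a: str, b: str) -> float:
    """Stand-in for fuzzywuzzy.ratio, scaled to 0..100."""
    return SequenceMatcher(None, a, b).ratio() * 100

def match_scale(raw_scale: str, threshold: float = 80.0):
    """Return the normalization table for the closest known scale, or None."""
    cleaned = raw_scale.lower().strip()  # light text preprocessing
    best_name, best_score = None, 0.0
    for name in KNOWN_SCALES:
        score = similarity(cleaned, name)
        if score > best_score:
            best_name, best_score = name, score
    if best_score >= threshold:
        return KNOWN_SCALES[best_name]
    return None  # the scale does not exist

print(match_scale("Yes No"))  # {'yes': 1.0, 'no': 0.0}
```

Fuzzy matching lets slightly different spellings or casings of the same scale resolve to one known entry instead of being rejected.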

As the project progresses, this description will include more details.

Configure

This section shows recommended practices for configuring the project locally.

Virtual environment

This step can be done from the command line or on PyCharm.

On commands

It is recommended to use a virtual environment. To create one:

python -m venv venv

To activate the virtual environment (Windows):

./venv/Scripts/activate

To activate the virtual environment (Linux):

source ./venv/bin/activate

To exit the virtual environment:

deactivate

On PyCharm

Open File/Settings... or press Ctrl+Alt+S. This opens the Settings window.

Open Project: SatisfactionRate/Project Interpreter in the left menu.

Open the Project Interpreter combobox and click Show All.... This opens a window with the available Python interpreters.

Click on + or press Alt+Insert. This opens a window to create a new Python interpreter.

We will choose the default options, which create a new virtual environment inside the project. Click the Ok button.

Click the Ok button again, and then once more.

Configuring on PyCharm

If you are using PyCharm, it is better to show it where the project's source code is. Right-click the src folder in the Project window on the left side. This opens a context menu.

Choose the Mark Directory as/Sources Root option. This marks src as the source root directory; it will appear as a blue folder in the Project navigator.

Install

The take_satisfaction package can be installed from PyPI:

pip install take_satisfaction

Or from setup.py, located in the src folder:

cd src
pip install . -U
cd ..

Installing take_satisfaction also installs all required libraries. However, you may want to install only the dependencies, or to update your environment after the requirements change.

All dependencies are declared in src/requirements.txt. They can be installed from the command line or on PyCharm.

On command

To install dependencies on environment, run:

python commands.py install

On PyCharm

After you create the virtual environment, or when you open PyCharm, it will ask whether you want to install the requirements. Choose Install.

Test

You can run the tests from the command line or on PyCharm. This feature is still being built.

On commands

First activate the virtual environment. Then run the kedro tests:

python commands.py test

Once this feature is built, coverage results will be available at htmlcov/index.html.

On PyCharm

Click on Edit Configurations... beside the Run icon. This opens the Run/Debug Configurations window.

Click on + or press Alt+Insert.

Choose Python tests/pytest option.

Fill the Target field with the path to the tests folder, i.e. <path to project>/src/tests.

Click on Ok button.

Click the Run icon. This runs the tests.

Open the Terminal window and run the following command to generate the HTML report:

coverage html

See coverage results at htmlcov/index.html.

Package

First activate the virtual environment. To package this project into .egg and .whl files:

python commands.py package

Generated packages will be placed in the src/dist folder. For each new package, do not forget to increase the version in src/take_satisfaction/__init__.py.

Upload

To upload build package to PyPI:

python commands.py upload

This uploads the latest built version. Afterwards, the package can be downloaded and installed by pip anywhere Python and pip are available:

pip install take_satisfaction

Notebooks

The packaged project is intended to be installed on a specific Databricks cluster, where we work with ML experiments using mlflow. An experiment is done as an example notebook in the shared folder, like:

import mlflow as ml
import take_satisfaction as tr
with ml.start_run():
    # experiment code using our pipelines
    params = {}
    ml.log_params(params)
    # other logs from results

Tips

In order to maintain the project:

  • Do not remove or change any lines in the .gitignore unless you know what you are doing.
  • When developing experiments and production code, follow the data standard related to the suitable layers.
  • When developing experiments, put them into notebooks, following the code policies.
  • Write notebooks on Databricks, synchronize them to this repository into a particular sub-folder of the notebooks folder, and commit them.
  • Do not commit any data.
  • Do not commit any log file.
  • Do not commit any credentials or local configuration.
  • Keep all credentials and local configuration in the conf/local/ folder.
  • Do not commit any file generated by the testing or building processes.
  • Run the tests before opening a pull request to make sure there are no bugs.
  • Follow git flow practices:
    • Create a feature branch for each new feature from the dev branch. Work on this branch with commits and pushes. Send a pull request to the dev branch when the work is done.
    • When a set of features is ready to release, merge the dev branch into the test branch. Apply several strict tests to be sure that everything is fine. If errors are found, fix them all and apply the tests again. When everything is OK, merge from test to master, increasing the release version and uploading to PyPI.
    • If a bug is found in production (the master branch), create a hotfix branch from master. Correct all errors and apply tests as in the test branch. When everything is OK, merge from the hotfix branch into master and then merge from master into dev.


Source Distributions

No source distribution files are available for this release.

Built Distributions

take_satisfaction-0.3.1-py3.7.egg (29.8 kB)

take_satisfaction-0.3.1-py3-none-any.whl (14.9 kB)
