
The Open-Source Slack AI App


This repository is a ready-to-run, self-hostable Slack AI app that lets you summarize threads and channels on demand using OpenAI (support for alternative and open-source LLMs will be added if there's demand). The official Slack AI product looks great, but with limited access and add-on pricing, I decided to open-source the version I built in September 2023. Learn more about how and why I built an open-source Slack AI.

Once up and running (instructions for the whole process are provided below), all of your Slack users will be able to generate the following for both public and private channels (a rough sketch of what happens under the hood follows this list):

  1. Thread summaries - Generate a detailed summary of any Slack thread (powered by GPT-3.5-Turbo)
  2. Channel overviews - Generate an outline of the channel's purpose based on the extended message history (powered by an ensemble of NLP models and a little GPT-4 to explain the analysis in natural language)
  3. Channel summaries since - Generate a detailed summary of a channel's messages since a given point in time (powered by GPT-3.5-Turbo). Note: this doesn't include threads yet.
  4. Full channel summaries (beta) - Generate a detailed summary of a channel's extended history (powered by GPT-3.5-Turbo). Note: this can get very long!
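
Under the hood, the thread and channel summaries boil down to pulling messages from the Slack API and handing the text to a model. Here's a rough sketch of the retrieval step, assuming slack_sdk; the project's own implementation lives in the ossai package and may differ:

    import os
    from slack_sdk import WebClient

    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

    def get_thread_text(channel_id: str, thread_ts: str) -> str:
        # conversations_replies returns the parent message plus every reply in the thread
        result = client.conversations_replies(channel=channel_id, ts=thread_ts)
        return "\n".join(message.get("text", "") for message in result["messages"])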


Getting Started

Follow these instructions to get a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

Ensure you have the following preconfigured or installed on your local development machine:

  • Python (a recent 3.x release) and Poetry for dependency management
  • An OpenAI API key
  • A Slack workspace where you can create and install apps
  • ngrok (or another way to expose a local server to the internet)

Installation

  1. Clone the repository to your local machine.
  2. Navigate to the project directory.
  3. Install the required Python packages using Poetry:
poetry install
  4. Install the spaCy language model:
poetry run python -m spacy download en_core_web_md
  5. Create a .env file in the root directory of the project and fill it with your API keys and tokens, using the example.env file as a template (a minimal sketch of how these values are loaded follows these steps):
cp example.env .env && open .env
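
For reference, here's a minimal sketch of how those .env values can be read at runtime, assuming python-dotenv; the project's own loading code may differ, and OPENAI_API_KEY is an assumed variable name, so check example.env for the exact keys:

    import os
    from dotenv import load_dotenv  # assumes python-dotenv is installed

    load_dotenv()  # reads the .env file from the project root

    SLACK_BOT_TOKEN = os.environ["SLACK_BOT_TOKEN"]    # from the "Install App" page (see below)
    SLACK_APP_TOKEN = os.environ["SLACK_APP_TOKEN"]    # app-level token with connections:write
    OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")  # assumed key name; confirm in example.env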

Slack app configuration

Make a copy of manifest.json and change the request URL to your ngrok or server URL.

Create a new Slack app at api.slack.com/apps and configure it using your edited manifest file.

You shouldn't need to make any other changes, but you can adjust the name, description, and other copy-related settings.

If you wish to adjust the names of the slash commands, you'll need to modify slack_server.py.

Once configured, retrieve the "Bot User OAuth Token" from the "Install App" page and add it to your .env file as SLACK_BOT_TOKEN.

Then, on the Basic Information page, under the App-Level Tokens heading, create a token with the scope connections:write and add it to your .env file as SLACK_APP_TOKEN.
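
To confirm the bot token is wired up correctly before going further, a quick sanity check like the following can help; this is an illustrative sketch assuming the slack_sdk package, not part of the project's own code:

    import os
    from slack_sdk import WebClient

    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
    response = client.auth_test()  # raises SlackApiError if the token is invalid
    print(f"Authed as bot user {response['user_id']} in workspace {response['team']}")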

Usage

To run the application, start the FastAPI server:

poetry run uvicorn ossai.slack_server:app --reload

You'll then need to expose the server to the internet using ngrok.

Run ngrok with the following command: ngrok http 8000

Then add the ngrok URL to your Slack app's settings.
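
For orientation, the server looks roughly like the following. This is a minimal sketch assuming slack_bolt's FastAPI adapter; the real ossai.slack_server module defines the actual slash-command handlers and may be wired differently:

    from fastapi import FastAPI, Request
    from slack_bolt import App
    from slack_bolt.adapter.fastapi import SlackRequestHandler

    # Bolt reads SLACK_BOT_TOKEN (and, if used, SLACK_SIGNING_SECRET) from the environment
    bolt_app = App()
    handler = SlackRequestHandler(bolt_app)

    app = FastAPI()

    @app.post("/slack/events")
    async def slack_events(request: Request):
        # Forward incoming Slack requests (events and slash commands) to Bolt
        return await handler.handle(request)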

Customization

The main customization options are the ChatGPT prompts (a sketch of the call shape follows this list):

  • Channel Summary: customize the ChatGPT prompt in topic_analysis.py
  • Thread Summary: customize the ChatGPT prompt in summarizer.py
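
Both prompts feed a standard chat-completion call. Here's an illustrative sketch of that shape, assuming the openai>=1.0 Python client; the project's actual prompts, models, and parameters live in the files above and will differ:

    from openai import OpenAI

    client = OpenAI()  # uses OPENAI_API_KEY from the environment

    def summarize(messages_text: str) -> str:
        # Swap out the system prompt to change the tone or structure of summaries
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[
                {"role": "system", "content": "Summarize this Slack conversation concisely."},
                {"role": "user", "content": messages_text},
            ],
        )
        return response.choices[0].message.content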

Testing

This project uses pytest and pytest-cov to run tests and measure test coverage.

Follow these steps to run the tests with coverage:

  1. Navigate to the project root directory.

  2. Run the following command to execute the tests with coverage:

    pytest --cov=ossai tests/
    

    This command will run all the tests in the tests/ directory and generate a coverage report for the ossai module.

  3. After running the tests, you will see a report in your terminal that shows the percentage of code covered by tests and highlights any lines that are not covered.

Please note that if you're using a virtual environment, make sure it's activated before running these commands.
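
If you're adding tests of your own, a common pattern is to mock the OpenAI client so the suite runs without network calls. Here's a hypothetical example of that pattern; the helper and its signature are illustrative, not the project's real API:

    from unittest.mock import MagicMock

    def fake_summarize(client, text: str) -> str:
        # Stand-in for a helper that wraps a chat-completion call
        response = client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": f"Summarize: {text}"}],
        )
        return response.choices[0].message.content

    def test_summarizer_returns_model_output():
        client = MagicMock()
        client.chat.completions.create.return_value.choices = [
            MagicMock(message=MagicMock(content="A short summary."))
        ]
        assert fake_summarize(client, "fake thread text") == "A short summary."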

Future Enhancements

  • Move to LangChain & LangSmith for extensibility, tracing, & control
  • Leverage LangSmith's feedback capabilities to capture & learn from user feedback
  • Add a /tldr_since command to summarize a channel's messages since a given date
  • Add Slack app setup details and sample app manifest to README
  • Incorporate threaded conversations in channel-level summaries
  • Implement evals suite to complement unit tests
  • Add support for alternative and open-source LLMs
  • Explore workflow for collecting data & fine-tuning models for cost reduction
  • Add support for anonymized message summaries
  • Leverage prompt tools like Chain of Density
  • Add support for pulling supporting context from external sources like company knowledge bases
  • Explore caching and other performance optimizations
  • Explore sentiment analysis

Contributing

I more than welcome contributions! Please read CONTRIBUTING.md for details on how to submit feedback, bugs, feature requests, enhancements, or your own pull requests.

License

This project is licensed under the GPL-3.0 License - see the LICENSE.md file for details.

Download files

Download the file for your platform.

Source Distribution

open_source_slack_ai-0.3.2.tar.gz (29.5 kB)

Built Distribution

open_source_slack_ai-0.3.2-py3-none-any.whl (29.6 kB)
