
ChatFlock


Driving Dynamic Multi-Participant Chat Interactions for AI and Human Discourse 🤖

📦 Installation

pip install -U chat-flock

or install with Poetry

poetry add chat-flock

🤔 What is this?

ChatFlock is a Python library that revolutionizes the way multi-participant chats are conducted by integrating Large Language Models (LLMs) at its core. Designed from first principles, it not only simplifies the orchestration of complex chat scenarios but also introduces an innovative structure that mirrors organizational communication.

At the heart of ChatFlock is the Conductor, a novel entity that determines the speaking order, enabling seamless coordination among AI and human participants. This orchestration allows for nuanced conversations and decision-making processes that go beyond traditional chat systems.
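
To make the idea concrete, here is a minimal, library-agnostic sketch of the conductor pattern: a conductor decides who speaks next among a mix of human and AI participants, and a simple loop drives the conversation. The names below (Participant, RoundRobinConductor, run_chat) are illustrative placeholders for the concept, not ChatFlock's actual API; see the usage examples below for real code.

```python
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class Participant:
    """A chat participant: a name plus a function that produces its next message."""
    name: str
    respond: Callable[[List[str]], str]


@dataclass
class RoundRobinConductor:
    """Toy conductor: cycles through participants in a fixed order.

    A smarter conductor could instead ask an LLM to pick the most relevant
    next speaker based on the transcript so far.
    """
    participants: List[Participant]
    _turn: int = field(default=0)

    def next_speaker(self) -> Participant:
        speaker = self.participants[self._turn % len(self.participants)]
        self._turn += 1
        return speaker


def run_chat(conductor: RoundRobinConductor, rounds: int) -> List[str]:
    """Drive the chat: ask the conductor who speaks, append their message to the transcript."""
    transcript: List[str] = []
    for _ in range(rounds):
        speaker = conductor.next_speaker()
        message = speaker.respond(transcript)
        transcript.append(f"{speaker.name}: {message}")
    return transcript


if __name__ == "__main__":
    # Stub participants: a real setup would back the AI participant with an LLM call
    # and the human participant with input().
    alice = Participant("Alice (human)", lambda history: "What should we build next?")
    bot = Participant("Bot (AI)", lambda history: f"Replying to {len(history)} earlier message(s).")
    for line in run_chat(RoundRobinConductor([alice, bot]), rounds=4):
        print(line)
```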

NOTE: We are still at a very early, experimental stage of development, so the library may be unstable and the API may change relatively frequently. Once we reach a stable version, everything will be properly tested and documented.

📝 Usage Examples

1-Participant Chat

2-Participant Chat

AI-Directed Multi-Participant Chat

End-to-End Examples

🚀 Features

  • Multi-Participant LLM-Based Chats: Enable rich, collaborative conversations with AI and human participants.
  • Conductor Orchestration: A unique system that manages turn-taking and dialogue flow, ensuring smooth chat progression.
  • Composition Generators: Smart modules that configure AI participants to achieve specific conversational goals.
  • Group-Based Participants (Hierarchical Chats): Implement sub-chats that handle complex queries internally before delivering concise responses, enabling hierarchical chat structures that mimic human-like organizational communication.
  • Extensive LLM Toolkit Support: Fully compatible with existing LLM ecosystems like LangChain, enhancing their features for a robust chat experience.
  • Web Research Module: A sophisticated tool that conducts automated web research, leveraging Selenium to analyze top search results.
  • BSHR (Brainstorm-Search-Hypothesize-Refine) Loop: An integrated module that employs the automated research tool in a loop using information literacy techniques for superior research outcomes (based on how humans do research). Credit: David Shapiro
  • Code Execution Tools: Facilitate the execution of code snippets directly within the chat, with support for both local and Docker environments (a minimal local-execution sketch follows this list).
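
As a rough illustration of the local code-execution idea mentioned above (not ChatFlock's actual executor API), a chat tool can run a snippet in a separate interpreter process with a timeout and hand the captured output back to the conversation:

```python
import subprocess
import sys


def run_python_snippet(code: str, timeout_seconds: float = 10.0) -> str:
    """Run a Python snippet in a separate interpreter process and return its combined output.

    A Docker-based variant would swap the command for something like
    ["docker", "run", "--rm", "python:3.11", "python", "-c", code].
    """
    result = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True,
        text=True,
        timeout=timeout_seconds,
    )
    output = result.stdout + result.stderr
    return output if output else f"(no output, exit code {result.returncode})"


if __name__ == "__main__":
    print(run_python_snippet("print(sum(range(10)))"))  # -> 45
```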

🌟 What's Next?

  • Asynchronous Chat Support: Enable non-real-time conversations, allowing for more flexible interaction timelines.
  • Automated AI Hierarchical Composition: Automatically configure AI participants in a complex company-like hierarchy to achieve specific conversational goals.
  • OpenAI Assistant Integration: Compatibility with OpenAI's latest features, expanding the library's AI capabilities.
  • Enhanced Code Execution: Advanced code execution features, including support for writing to files and more comprehensive execution environments.

💁 Contributing

As an open-source project in a rapidly developing field, we are extremely open to contributions, whether it be in the form of a new feature, improved infrastructure, or better documentation.

For detailed information on how to contribute, see here.

📈 Releases

You can see the list of available releases on the GitHub Releases page.

We follow the Semantic Versions specification.

We use Release Drafter. As pull requests are merged, a draft release is kept up to date with the list of changes, ready to publish at any time. With the categories option, you can categorize pull requests in the release notes using labels.

List of labels and corresponding titles

| Label                         | Title in Releases       |
|-------------------------------|-------------------------|
| enhancement, feature          | 🚀 Features             |
| bug, refactoring, bugfix, fix | 🔧 Fixes & Refactoring  |
| build, ci, testing            | 📦 Build System & CI/CD |
| breaking                      | 💥 Breaking Changes     |
| documentation                 | 📝 Documentation        |
| dependencies                  | ⬆️ Dependencies updates |

GitHub creates the bug, enhancement, and documentation labels for you. Dependabot creates the dependencies label. Create the remaining labels on the Issues tab of your GitHub repository when you need them.

Building and releasing the package

Building and releasing a new version of the package involves the following steps:

  • Bump the version of the package with poetry version <version>. You can pass the new version explicitly, or a rule such as major, minor, or patch. For more details, refer to the Semantic Versions standard.
  • Commit and push the change to GitHub.
  • Create a GitHub release.
  • And... publish 🙂 poetry publish --build

🛡 License

This project is licensed under the terms of the MIT license. See LICENSE for more details.

📃 Citation

@misc{chat-flock,
  author = {doodledood},
  title = {Driving Dynamic Multi-Participant Chat Interactions for AI and Human Discourse},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/doodledood/chat-flock}}
}

❤️ Package Template Credits

This project was generated with python-package-template: 🚀 Your next Python package needs a bleeding-edge project structure.

