
FlowAI

A CLI tool for multi-agent LLM tasks.

Installation

Install FlowAI with pipx:

pipx install glagos-flowai

Usage

Initialization

First, initialize FlowAI:

flowai --init

This will guide you through an interactive setup process, letting you choose your default model, stream mode, and other options with the arrow keys. Your current settings are displayed before onboarding begins and are preselected in the prompts.

Check Current Status

To check the current status, including the model, stream mode, flow, template file, context file, and final check:

flowai --status

Basic Usage

To run FlowAI with a simple prompt:

flowai "Your prompt here"

Model Selection

To specify a model:

flowai --model openai:gpt-4 "Your prompt here"

To list available models for all providers:

flowai --list-models

Streaming

To stream the output directly without waiting for the full response:

flowai --stream "Your prompt here"

Multi-Agent Flow

To use multiple agents to complete the task:

flowai --flow "Your prompt here"

Templates

To use a template file containing sections:

flowai --template-file path/to/template "Your prompt here"

To select a prompt file from the flowai-prompts directory:

flowai --select-prompt-file

Context Options

To use a context file for global context:

flowai --context-file path/to/context "Your prompt here"

To run a shell command to generate context:

flowai --context-shell-command "your-command" "Your prompt here"
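For example, you can feed the output of a git command to FlowAI as context (illustrative; assumes you are inside a git repository with staged changes):

```shell
# Use the staged diff as context for the prompt
flowai --context-shell-command "git diff --staged" "Write a concise commit message for these changes"
```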

To set context from the system clipboard:

flowai --context-from-clipboard "Your prompt here"

Final Check

To run a final check after response assembly:

flowai --final-check "Your final check prompt" "Your prompt here"

Debugging

To enable debug mode to display prompts:

flowai --debug "Your prompt here"

Markdown Options

To return the response without Markdown formatting:

flowai --no-markdown "Your prompt here"

Features

  • Interactive Setup: choose your model, stream mode, flow, template file, context file, and final check interactively during setup.
  • Configuration Display: current configuration shown during initialization.
  • Pre-Selection: current settings preselected in the setup prompts.
  • Multiple LLM Providers: support for OpenAI, Anthropic, Groq, Google, and Ollama.
  • Real-Time Animation: elapsed-time animation while waiting for a response.
  • Markdown Rendering: responses rendered as Markdown in the terminal.
  • Response Time Display: total round-trip response time, including connection setup.
  • Formatted Output: easy-to-read formatted output.
  • Error Reporting: detailed error messages and graceful error handling.
  • Configuration Validation: validation to ensure correct provider-model pairing.
  • Stream Mode Override: override and update the default stream mode setting.
  • Context Options: set context from a file, a shell command, or the clipboard.
  • Template Support: use template files and select prompt files from a directory.
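The options above can be combined in a single invocation. A sketch (the model name and file path are illustrative):

```shell
# Stream a response from a specific model, using a file as global context
flowai --model openai:gpt-4 --stream --context-file notes.txt "Summarize the attached notes"
```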

Supported Providers

  • OpenAI: Dynamically fetches available models.
  • Anthropic: Fetches available models from Anthropic API.
  • Groq: Fetches available models from Groq API.
  • Google: Fetches available models from Google API.
  • Ollama: Fetches available models from local Ollama instance.

You can easily extend FlowAI to support additional providers in the future.

Troubleshooting

If you encounter any issues while fetching models or sending prompts, FlowAI will display detailed error messages. Check your API keys and internet connection if you're having trouble connecting to a provider. If you see a configuration error, try running flowai --init to reconfigure FlowAI.

Contributing

We welcome contributions from the community! If you're familiar with a model that isn't currently supported, we'd love your help in integrating it into the library. The library could also use some unit tests. Here's how you can contribute:

  1. Fork the Repository: Start by forking the repository.
  2. Clone the Forked Repository: Clone the forked repository to your local machine and switch into its directory.
  3. Create a New Branch: Create a new branch for each feature or bug fix you're working on.
  4. Make Your Changes: Make the necessary changes in the new branch.
  5. Test Your Changes: Make sure your changes do not break any existing functionality. Add new tests if necessary.
  6. Commit and Push Your Changes: Once you're happy with your changes, commit them and push the branch to your forked repository on GitHub.
  7. Create a Pull Request: Navigate to the original repository and create a pull request. Explain the changes you made, why you believe they're necessary, and any other information you think might be helpful.
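The steps above, sketched as shell commands (the repository URL, branch name, and commit message are placeholders):

```shell
# Steps 1-3: fork on GitHub, then clone your fork and create a branch
git clone https://github.com/<your-username>/flowai.git
cd flowai
git checkout -b my-feature

# Steps 4-6: make and test your changes, then commit and push
git commit -am "Describe your change"
git push origin my-feature
# Step 7: open a pull request against the original repository on GitHub
```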

After you've submitted your pull request, the maintainers will review your changes. You might be asked to make some additional modifications or provide more context about your changes. Once everything is approved, your changes will be merged into the main branch.

We value all our contributors and are grateful for any time you can spare to help improve FlowAI. Happy coding!
