
FlowAI

A CLI tool for multi-agent LLM tasks.

Installation

  1. Ensure you have Poetry installed.
  2. Clone this repository.
  3. Run poetry install in the project directory.

Usage

First, initialize FlowAI:

poetry run flowai --init

This launches an interactive setup that lets you choose your default provider, model, quiet mode, stream mode, and other options using the arrow keys. Your current settings are displayed before onboarding begins and are preselected in the prompts.

Check the current status (provider, model, quiet mode, stream mode, flow, template file, context file, and final check):

poetry run flowai --status

Basic usage:

poetry run flowai "Your prompt here"

Specify a provider and model:

poetry run flowai --provider openai --model gpt-4 "Your prompt here"

Use quiet mode (only shows timer and final response):

poetry run flowai --quiet "Your prompt here"

or use the short flag:

poetry run flowai -q "Your prompt here"

Stream the output directly without waiting for full response:

poetry run flowai --stream "Your prompt here"

or use the short flag:

poetry run flowai -s "Your prompt here"

List available models for the current provider:

poetry run flowai --list-models

List available models for a specific provider:

poetry run flowai --provider anthropic --list-models

Use multiple agents to complete the task:

poetry run flowai --flow "Your prompt here"

Use a template file containing sections:

poetry run flowai --template-file path/to/template "Your prompt here"
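The template format isn't documented on this page; as a purely illustrative sketch (section names and bracket syntax are assumptions, not FlowAI's actual format), a sectioned template might look like:

```text
[Summary]
Summarize the input in no more than two sentences.

[Details]
List the key points as short bullets.
```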

Use a context file for global context:

poetry run flowai --context-file path/to/context "Your prompt here"
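The context file supplies global context to the run; its format isn't specified here, but a plain-text file is a reasonable sketch (contents are illustrative only):

```text
Project: FlowAI, a Python CLI managed with Poetry.
Audience: developers; prefer concise, technical answers.
```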

Run a final check after response assembly:

poetry run flowai --final-check "Your final check prompt" "Your prompt here"

Run poetry run flowai --help for more usage instructions.

Features

  • Interactive provider, model, quiet mode, stream mode, flow, template file, context file, and final check selection during setup
  • Display of current configuration during initialization
  • Pre-selection of current settings in setup prompts
  • Support for multiple LLM providers (OpenAI, Anthropic, Ollama)
  • Real-time animation with elapsed time display while waiting for response
  • Markdown rendering of responses in the terminal
  • Display of total round-trip response time, including connection setup
  • Easy-to-read formatted output
  • Detailed error reporting and graceful error handling
  • Configuration validation to ensure correct provider-model pairing
  • Quiet mode: show only the timer and final response
  • Stream mode: output the response directly without waiting for completion
  • Ability to override and update default settings for quiet and stream modes

Supported Providers

  • OpenAI (default): Dynamically fetches available models
  • Anthropic: Fetches available models from Anthropic API
  • Ollama: Fetches available models from local Ollama instance

FlowAI can be extended to support additional providers.
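FlowAI's internal provider interface isn't shown on this page, but a new backend typically needs just two operations: list the available models and send a prompt. A hypothetical adapter, with names and signatures that are illustrative rather than FlowAI's actual API, might look like:

```python
from abc import ABC, abstractmethod


class Provider(ABC):
    """Hypothetical base class; FlowAI's real interface may differ."""

    @abstractmethod
    def list_models(self) -> list[str]:
        """Return the model names this backend offers."""

    @abstractmethod
    def send_prompt(self, model: str, prompt: str) -> str:
        """Send a prompt to the given model and return the response text."""


class EchoProvider(Provider):
    """Toy backend that echoes the prompt, useful for wiring tests."""

    def list_models(self) -> list[str]:
        return ["echo-1"]

    def send_prompt(self, model: str, prompt: str) -> str:
        return f"[{model}] {prompt}"
```

A real adapter would call the provider's SDK inside send_prompt and raise a descriptive error on failure, so the CLI's error reporting stays uniform across backends.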

Troubleshooting

If fetching models or sending prompts fails, FlowAI displays a detailed error message. If you have trouble connecting to a provider, check your API keys and internet connection. If you see a configuration error, run poetry run flowai --init to reconfigure FlowAI.
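Provider connection failures are often missing credentials. Assuming the OpenAI and Anthropic backends read the standard environment variables used by those providers' official SDKs (an assumption; verify against your configuration), you would set them like:

```shell
# Assumed variable names: OPENAI_API_KEY and ANTHROPIC_API_KEY are the
# defaults read by the official OpenAI and Anthropic SDKs.
export OPENAI_API_KEY="sk-your-openai-key"
export ANTHROPIC_API_KEY="sk-ant-your-anthropic-key"
```

Ollama runs locally and needs no API key, but the Ollama server must be running for model listing to succeed.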

Download files

Download the file for your platform.

Source Distribution

glagos_flowai-0.1.0.tar.gz (9.8 kB)


Built Distribution


glagos_flowai-0.1.0-py3-none-any.whl (10.8 kB)


File details

Details for the file glagos_flowai-0.1.0.tar.gz.

File metadata

  • Download URL: glagos_flowai-0.1.0.tar.gz
  • Upload date:
  • Size: 9.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.3 Darwin/23.1.0

File hashes

Hashes for glagos_flowai-0.1.0.tar.gz
  • SHA256: 707a7035e6c51b134142ea7d947b6e37f721c3f43e4977a9d5ab172841c3dc80
  • MD5: 47402303834c391e480889934bda23bc
  • BLAKE2b-256: 058ce08a70486986857b4657c298a75be2bcbca6c6d93ac7d947d86a3f802319

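To check a download against the digests above, you can recompute the SHA-256 locally; a minimal sketch:

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the hex SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the published digest for the sdist:
# sha256_of("glagos_flowai-0.1.0.tar.gz") should equal
# "707a7035e6c51b134142ea7d947b6e37f721c3f43e4977a9d5ab172841c3dc80"
```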

File details

Details for the file glagos_flowai-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: glagos_flowai-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 10.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.8.3 CPython/3.12.3 Darwin/23.1.0

File hashes

Hashes for glagos_flowai-0.1.0-py3-none-any.whl
  • SHA256: 1427686d4b48dec6db0d790d608606c89197171e024393f79fd4f40794e6c4c2
  • MD5: e18620b7468156640e9cf2f23b047c86
  • BLAKE2b-256: 63611feef08804fefc33b7f16480f578c0d5bc6a16e145523e593026672b71a4

