A Python CLI tool for streamlining development workflows with LLMs - commit messages, PR descriptions, code reviews and more
FlowAI
FlowAI is a Python-based CLI tool that helps developers streamline their development workflow by automating common tasks using LLMs (Large Language Models).
Features
- Generate detailed commit messages from git diffs
- Create comprehensive pull request descriptions
- Perform automated code reviews
- Interactive chat mode with streaming support
- Web search capability with Gemini models
- Thinking mode for Gemini 2.5 models
- Support for multiple LLM providers:
  - OpenAI
  - Anthropic
  - Groq
  - Gemini
  - Ollama
- Cross-platform compatibility (Windows, Mac, Linux)
- Markdown rendering in terminal
- Streaming responses for real-time feedback
- Configurable output formatting per command
Command System
FlowAI uses a powerful command system to automate common tasks. Commands are defined in ~/flowai-prompts/prompt-index.txt and can be customized to your needs.
Command Features
- Pre-configured context gathering
- Template-based prompts
- Interactive user input
- Platform-specific variants (Windows/Unix)
- Configurable output formatting:
  - markdown - rich formatted output (default)
  - raw - plain text output (ideal for commit messages, PR descriptions)
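The exact schema of ~/flowai-prompts/prompt-index.txt is not shown here; the entry below is purely hypothetical, sketching the kind of fields the features above imply (a context-gathering command, a prompt template, and an output format). Check the file shipped with your installation for the real syntax.

```text
# HYPOTHETICAL entry - consult your installed prompt-index.txt for the actual schema
staged-commit-message:
    context: git diff --staged            # pre-configured context gathering
    prompt:  prompts/commit-message.txt   # template-based prompt
    format:  raw                          # markdown | raw
```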
Example Commands
# Generate a commit message for staged changes (raw output)
flowai --command staged-commit-message
# Review code changes (markdown formatted)
flowai --command staged-code-review
# Create PR description (raw output)
flowai --command pull-request
Chat Mode Features
FlowAI's chat mode is a powerful way to interact with the AI assistant. You can:
- Start a direct chat session:
flowai --chat
- Turn any command into a chat session by adding --chat:
# Start with a code review and continue chatting about it
flowai --command staged-code-review --chat
# Generate a commit message and discuss it
flowai --command staged-commit-message --chat
# Create a PR description and refine it through chat
flowai --command pull-request --chat
When using --chat with a command, FlowAI will:
- Execute the command normally first
- Use the command's output as context for a new chat session
- Allow you to discuss, refine, or ask questions about the output
- Keep the original context (e.g., git diff, code changes) available for reference
Chat Features
- Stream mode toggle (/stream, /stream on, /stream off)
- Web search capability with Gemini models
- Thinking mode for Gemini 2.5 models
- Token usage tracking
- Real-time response streaming
- Command system for common operations
- Chat history persistence
- Markdown rendering
- Loading indicators with timing information
Chat Commands
- /help - Show available commands
- /quit - Exit chat mode
- /clear - Clear chat history
- /stream - Toggle stream mode
- /stream on - Enable stream mode
- /stream off - Disable stream mode
Known Issues
We are actively working on fixing several issues in the chat mode:
- Ctrl+C handling may not work correctly in some scenarios
- Status display (tokens and stream mode) may not show correctly in some terminals
- Double "Generating response..." message may appear
- Some formatting issues with streamed responses
- Terminal compatibility issues with certain commands
Please check our TODO.md file for a complete list of issues being tracked.
Installation
pip install flowai
Configuration
Run the initial setup:
flowai --init
This will guide you through:
- Setting up API keys
- Choosing your default model
- Configuring stream mode preferences
Usage
Basic Commands
# Start chat mode
flowai --chat
# pipe output into flowai as context
git diff | flowai "summarise these changes in 1 paragraph"
# ask any question
flowai "how do i do a git rebase? Is it dangerous? Be concise"
# Generate commit message for staged changes (raw output)
flowai --command staged-commit-message
# Review staged changes (markdown formatted)
flowai --command staged-code-review
# Get help (markdown formatted)
flowai --command help
# Get specific help on any flowai feature
flowai --command help "how do i create a custom command that will work in windows and unix style platforms?"
# Use web search (only works with Gemini models)
flowai --web-search "What are the latest developments in quantum computing?"
# Enable thinking mode for Gemini 2.5 models
flowai --model gemini/gemini-2.5-pro --thinking-budget 2048 "Analyze this code for security issues"
Advanced Features
Web Search
FlowAI supports web search capabilities with Google's Gemini models, allowing you to access up-to-date information from the internet.
# Enable web search (only works with Gemini models)
flowai --web-search "What are the latest developments in quantum computing?"
# Use in chat mode
flowai --chat --web-search
# Combine with commands
flowai --command help --web-search "What are the latest features in FlowAI?"
When web search is enabled:
- The current date and time are included in the prompt
- The AI will cite its sources in a dedicated "Sources" section
- Citations include webpage titles and URLs
- Only works with Gemini models (automatically disabled for other models)
Thinking Mode (Gemini 2.5 Models)
Gemini 2.5 models support a "thinking mode" that allows the AI to perform more thorough reasoning before responding.
# Enable thinking mode with a specific budget (1024+ recommended)
flowai --model gemini/gemini-2.5-pro --thinking-budget 2048 "Analyze this code for security issues"
# Use in chat mode
flowai --chat --thinking-budget 2048
# Combine with commands
flowai --command code-review --thinking-budget 2048
Thinking mode:
- Allows the AI to perform more thorough reasoning
- Higher budgets (1024+) enable more complex thinking
- Setting to 0 disables thinking mode
- Status is displayed in the terminal output
- Only works with Gemini 2.5 models
Image Generation
FlowAI supports generating images using Google's Gemini models. You can create images from text prompts and refine them interactively.
# Basic image generation
flowai --create-image "A futuristic spaceship hovering over the surface of Mars"
# Use a reference image to guide generation
flowai --create-image "A futuristic spaceship hovering over the surface of Mars" --reference-image path/to/image.jpg
# Use an image from clipboard as reference
flowai --create-image "A futuristic spaceship hovering over the surface of Mars" --reference-from-clipboard
# Enter interactive chat mode to refine the image
flowai --create-image "A futuristic spaceship hovering over the surface of Mars" --chat
You can also provide a reference image without explicitly using the --create-image flag:
# These automatically enable image generation mode
flowai --reference-image path/to/image.jpg "A futuristic spaceship hovering over the surface of Mars"
flowai --reference-from-clipboard "A futuristic spaceship hovering over the surface of Mars"
In interactive chat mode, you can:
- Type refinement instructions to modify the current image
- Use /help to see available commands
- Use /reference <path> to set a new reference image
- Use /clipboard to use a clipboard image as reference
- Type /quit to exit chat mode
For more details, see Image Generation Guide.
Output Formatting
Commands can be configured to output in either markdown or raw format:
- Markdown format: Rich text with formatting, ideal for reviews and documentation
- Raw format: Plain text, perfect for commit messages and PR descriptions
You can:
- Set the format per command in prompt-index.txt
- Override with the --no-markdown flag
- Fall back to markdown if no format is specified
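Because raw output contains no markdown decoration, it can be piped straight into other tools with no cleanup step. One sketch (assuming flowai is installed and configured, and that changes are staged):

```shell
# Generate a raw commit message and use it as-is;
# git reads the message from stdin via -F -
flowai --command staged-commit-message | git commit -F -
```

The same pattern works for PR descriptions or any other command configured for raw output.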
Contributing
Please see our CONTRIBUTING.md for guidelines on how to contribute to this project.
License
MIT License - see LICENSE for details