Fid
AI for the command line, built for pipelines.
Fid brings Large Language Models (LLMs) directly to your terminal. It can take command output, process it with AI, and return results in formats like Markdown, JSON, or plain text. Think of it as a lightweight way to make your command-line workflows smarter with just a touch of AI.
Besides the default Gemini models, you can also use OpenAI, Cohere, Groq, or Azure OpenAI.
Installation
uv pip install fid
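If you don't use uv, the same PyPI package can be installed with plain pip:

```shell
pip install fid
```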
What Can It Do?
Fid reads from standard input and pairs it with a prompt you provide via command-line arguments. That input is then sent to an LLM, which generates an answer, formatted in Markdown.
You can configure models in your settings file by running fid --settings.
Examples
1. Summarize logs
cat /var/log/syslog | fid summarize the key errors
2. Shell commands
fid explain "ls -lh /etc"
3. Get one-liner shell fixes (using a custom role)
fid --role shell find all .py files modified in the last 24 hours
Usage
- -m, --model: specify the Large Language Model to use
- --role: specify the role to use (see Custom Roles)
- --list-roles: list the roles defined in the configuration
- --settings: open the settings file
- --dirs: print the directories where configuration is stored
- --reset-settings: restore settings to defaults
- --version: show the version
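Flags compose naturally with piped input. A brief sketch, assuming fid is installed and that gemini-2.0-flash (fid's documented default) is available:

```shell
# See which roles are configured
fid --list-roles

# Select a model explicitly for a piped request
cat /var/log/syslog | fid -m gemini-2.0-flash "summarize the key errors"
```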
Custom Roles
Roles allow you to set system prompts. Here is an example of a shell role:
roles:
  shell:
    - you are a shell expert
    - you do not explain anything
    - you simply output one liners to solve the problems you're asked
    - you do not provide any explanation whatsoever, ONLY the command
Then, use the custom role in fid:
fid --role shell list files in the current directory
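Because the shell role is instructed to output only a command, you can capture its suggestion and review it before running it. A sketch, assuming fid is installed and the shell role above is configured (executing model output is at your own risk, so inspect it first):

```shell
# Capture the suggested one-liner, inspect it, then execute it deliberately
cmd=$(fid --role shell "find all .py files modified in the last 24 hours")
echo "$cmd"
eval "$cmd"
```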
Setup
Gemini
Fid uses gemini-2.0-flash by default.
Set the GOOGLE_API_KEY environment variable. If you don't have one yet, you can get one from Google AI Studio.
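For example, in your shell profile or current session (the key value below is a placeholder):

```shell
# Make the Gemini key visible to fid; get a real key from Google AI Studio
export GOOGLE_API_KEY="your-gemini-api-key"
```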
OpenAI
Set the OPENAI_API_KEY environment variable. If you don't have one yet, you can grab one from the OpenAI website.
Alternatively, set the AZURE_OPENAI_KEY environment variable to use Azure OpenAI. Grab a key from Azure.
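As a sketch, export whichever variable matches your provider (values are placeholders):

```shell
# For OpenAI directly:
export OPENAI_API_KEY="your-openai-api-key"

# Or for Azure OpenAI instead:
export AZURE_OPENAI_KEY="your-azure-openai-key"
```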
Cohere
Cohere provides enterprise-optimized models.
Set the COHERE_API_KEY environment variable. If you don't have one yet, you can
get it from the Cohere dashboard.
Groq
Groq provides models powered by their LPU inference engine.
Set the GROQ_API_KEY environment variable. If you don't have one yet, you can
get it from the Groq console.
License
File details
Details for the file fid-0.1.0.tar.gz.
File metadata
- Download URL: fid-0.1.0.tar.gz
- Upload date:
- Size: 6.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | cdb818c6b217ad8e37b63c9b79df38a810fe6cf89c3714504c4b4e1526863e0c |
| MD5 | 0f31f75dd9999f8c345ba7918ea67472 |
| BLAKE2b-256 | 8cb55309a3382dcd255daae9d2b9f90b84168d625a2474643d372cd3c9a5c8de |
File details
Details for the file fid-0.1.0-py3-none-any.whl.
File metadata
- Download URL: fid-0.1.0-py3-none-any.whl
- Upload date:
- Size: 6.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.5
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 6c3d30ab9fa3711f83c3033560cb9fb766174498ca564749cb317d4b1b84dfdf |
| MD5 | 88b1e16c62f275fa07ed03ebae6ef818 |
| BLAKE2b-256 | 316bd09706f3488de0e904ff8d81a0a4625b0dd6917964553e8399d493cb2b23 |