
TAI [Terminal AI], a terminal AI assistant

Project description


TAI is a CLI that helps you with Linux and macOS terminal commands. Just ask it a question, and it will use AI to suggest a command and explain what it does. If you like the suggestion, the script can automatically run the command for you in your terminal.

Demo

BREAKING

The command name has changed from lfg to tai, and the package name from lfg-llama to terminal-ai-assistant.

Why & What?

  • GitHub Copilot CLI syntax feels clunky to me
  • Faster than using Gemini, ChatGPT, or similar in a browser
  • Simpler to find answers without checking man pages
  • NEW: Switching to the free GPT-4o model
  • NEW: Now you can run commands right from this command-line interface
  • NEW: New package name terminal-ai-assistant

However, never trust the output entirely.

Installation

# install pipx
brew install pipx

# add pipx binaries to PATH
pipx ensurepath

# restart your terminal
# install TAI
pipx install terminal-ai-assistant

Usage

This executable uses OpenAI, which means you need an API token.

GPT-4o is free to use.

Add the token to your .bashrc/.zshrc and reload your terminal.

export OPENAI_API_KEY={replace_me}
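
The step above can be sketched as follows (zsh shown; substitute ~/.bashrc for bash, and your real key for the placeholder):

```shell
# Persist the key so every new terminal session picks it up
echo 'export OPENAI_API_KEY={replace_me}' >> ~/.zshrc
source ~/.zshrc

# Confirm the variable is set (this prints the key, so avoid it in shared logs)
echo "$OPENAI_API_KEY"
```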

The command takes your question as its arguments:

$ tai <query>

For example:

$ tai kill port 3000

fuser -k 3000/tcp

Explanation:
The `fuser` command identifies processes using files or sockets. The `-k` option is used to kill those processes. Here, `3000/tcp` specifies the TCP port number 3000. This command effectively kills any process currently using port 3000.

> Execute the command? (N/y):
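
The explanation boils down to: find the PIDs, then signal them. A self-contained sketch of that find-then-kill pattern (the commented `lsof` line is a common alternative lookup where fuser's `3000/tcp` syntax isn't available, e.g. the BSD fuser on macOS):

```shell
# Start a throwaway background process standing in for whatever
# holds the port, then kill it via the pipe-PIDs-to-kill pattern
sleep 300 &
pid=$!
echo "$pid" | xargs kill -9

# For a real port, lsof can supply the PIDs instead:
#   lsof -ti tcp:3000 | xargs kill -9
```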

Change the LLM

$ tai get pods from all namespaces

kubectl get pods --all-namespaces


Explanation:
The `kubectl get pods --all-namespaces` command lists all the pods across all namespaces in a Kubernetes cluster. The `--all-namespaces` flag is used to fetch the pods from every namespace instead of the default namespace.

> Execute the command? (N/y):

Development

# install pipenv and create a Python 3.11 virtualenv for the project
pip install --user pipenv
pipenv --python 3.11
pipenv install

# run the CLI from the working tree
pipenv run tai kill port 3000

TODO

  • Fix the setup and pyproject files, including the GitHub workflow for releasing the package



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lfg_llama-3.0.1.tar.gz (4.0 kB)

Uploaded Source

Built Distribution

lfg_llama-3.0.1-py3-none-any.whl (4.4 kB)

Uploaded Python 3

File details

Details for the file lfg_llama-3.0.1.tar.gz.

File metadata

  • Download URL: lfg_llama-3.0.1.tar.gz
  • Upload date:
  • Size: 4.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for lfg_llama-3.0.1.tar.gz:

  • SHA256: 4ec408f4be2e71fead30d6d80e3299d264052611828ffb201a184068f140eb06
  • MD5: 9851043ac91ffa321fa38ded2051912a
  • BLAKE2b-256: d272e0faa23d2d90f2cd838c16379f6e8400c623f2e0bd0c7ac63c03ea4e24ba


File details

Details for the file lfg_llama-3.0.1-py3-none-any.whl.

File metadata

  • Download URL: lfg_llama-3.0.1-py3-none-any.whl
  • Upload date:
  • Size: 4.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for lfg_llama-3.0.1-py3-none-any.whl:

  • SHA256: 3609f2c72ec1d6c611c1bd10a93b2ce0a6dc43aa17ffe2e448493f536b0122c9
  • MD5: e7e4de95b160bd47236a40baa193401f
  • BLAKE2b-256: ee33444b4dff43b79dc41be6cedc8951872cae1b80ef54b1ca6aabfe28e5ca8e

