LFG, It Really Whips the Llama's Ass 🦙🦙🦙🦙

Project description

LFG

Demo

LFG is a command-line tool that helps you find the right terminal command for the task at hand. Such sales pitch. It uses GPT-4o as its engine.

Why?

  • Firstly, this was created to test Ollama -> Groq
  • I do not like the GitHub Copilot command line
  • Quicker than using Gemini/ChatGPT/Google directly in the browser
  • Easier to find what you need without opening man pages
  • NEW: Switched to the GPT-4o model, which is free

However, never trust the output entirely.

Installation

# install pipx
brew install pipx

# add pipx binaries to path
pipx ensurepath

# restart your terminal
# install LFG
pipx install lfg-llama

Usage

This executable uses the OpenAI API, which means you need an API token.

GPT-4o is free to use.

Add the token to your .bashrc/.zshrc and reload your terminal.

export OPENAI_API_KEY={replace_me}
$ lfg query
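If you forget to export the token, the tool will fail with an authentication error. A minimal sketch of a guard you could add to your own scripts (the `require_key` helper is hypothetical, not part of lfg):

```shell
# Check the key is exported before calling lfg
# (require_key is a made-up helper name, not part of lfg itself)
require_key() {
  if [ -z "${OPENAI_API_KEY:-}" ]; then
    echo "OPENAI_API_KEY is not set; add it to your .bashrc/.zshrc" >&2
    return 1
  fi
  return 0
}
```

Used as, for example, `require_key && lfg "kill port 3000"`.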

Now you can use the executable:

lfg "kill port 3000"

# Kill process listening on port 3000
lsof -i :3000 | xargs kill
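The generated one-liner sends SIGTERM to every matching process. A slightly more defensive variant (a sketch, with the port hard-coded as an example) first checks that something is actually listening:

```shell
# Kill whatever listens on port 3000, but only if something is there
# (sketch; adjust PORT as needed, escalate to kill -9 only as a last resort)
PORT=3000
PIDS=$(lsof -t -i :"$PORT" 2>/dev/null || true)
if [ -n "$PIDS" ]; then
  kill $PIDS          # SIGTERM first, so processes can shut down cleanly
else
  echo "nothing listening on port $PORT"
fi
```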

Change the LLM

$ lfg "list ec2 pipe json jq get name" -m llama370b

# List EC2 instances with name

aws ec2 describe-instances \
  --query 'Reservations[].Instances[].{Name:Tags[?Key==`Name`]|[0].Value,InstanceId:InstanceId}' \
  --output json | jq '.[] | {Name, InstanceId}'

This command uses the AWS CLI to describe EC2 instances, then pipes the JSON output to `jq`, which emits one object per instance containing its name and ID.
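Since running the AWS half requires credentials, the idea of piping instance data to `jq` can be tried on a canned sample shaped like the CLI's output (the names and IDs below are made up, and the filter shown is one working variant):

```shell
# Feed jq a fake describe-instances result to see the filter in isolation
echo '[{"Name":"web-1","InstanceId":"i-0abc"},{"Name":"db-1","InstanceId":"i-0def"}]' \
  | jq -c '.[] | {Name, InstanceId}'
# prints:
# {"Name":"web-1","InstanceId":"i-0abc"}
# {"Name":"db-1","InstanceId":"i-0def"}
```

`{Name, InstanceId}` is jq's shorthand for building an object from fields of the same name.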

Development

pip install --user pipenv
pipenv --python 3.11
pipenv install

pipenv run lfg "kill port 3000"

TODO

  • Fix the setup and pyproject files, including the GitHub workflow for releasing the package

Download files

Download the file for your platform.

Source Distribution

lfg_llama-2.0.1.tar.gz (3.5 kB)

Uploaded Source

Built Distribution


lfg_llama-2.0.1-py3-none-any.whl (3.9 kB)

Uploaded Python 3

File details

Details for the file lfg_llama-2.0.1.tar.gz.

File metadata

  • Download URL: lfg_llama-2.0.1.tar.gz
  • Upload date:
  • Size: 3.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for lfg_llama-2.0.1.tar.gz:

  • SHA256: 634a23985712f71757d11e8e3fa0ffa7b7970a98c45d805c62976978b889bd98
  • MD5: 289e5b7c87ac178ba6fe1beed702d9e1
  • BLAKE2b-256: 5072ce20dec7b15cc0dcfbee5e972a923a0814c803e2bac806806f37890a6266


File details

Details for the file lfg_llama-2.0.1-py3-none-any.whl.

File metadata

  • Download URL: lfg_llama-2.0.1-py3-none-any.whl
  • Upload date:
  • Size: 3.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.11.9

File hashes

Hashes for lfg_llama-2.0.1-py3-none-any.whl:

  • SHA256: 01bd2cc6d959e9848f64a321b2d65e12370c49a2abc44bed48bb88a5271122f7
  • MD5: 98d5e3dafccb62380c34982a0d68d41c
  • BLAKE2b-256: 8a078de38e51fe21e282ffe93142b1f5cfd1b248ba2070c75aef8076bf8b8533

