
uhh


i forgor 💀 — ask your local LLM for the command you forgot, then run it.

uhh is a tiny CLI that asks a local Ollama instance for the command-line answer to a natural-language question, prints it, and optionally runs it. Cross-platform, zero runtime dependencies.

Demo

$ uhh how do i scp test.txt to mhills@172.16.0.1
$ scp test.txt mhills@172.16.0.1:~/
  # Copies test.txt to the remote host's home directory.

Run it? [y/N] y
test.txt    100%   12B  ...

It reads safe local context — your username, hostname, current directory, names of ~/.ssh/*.pub keys, ssh config Host aliases — so the suggested command uses your real values instead of placeholders.

If you ask about a different OS than your current one (e.g. you're on Linux but ask "how do I unlock the keychain on mac"), uhh produces the right command but skips the run prompt for safety.

Install

pip install uhh

Or with pipx for an isolated install:

pipx install uhh

You also need Ollama running somewhere reachable, plus at least one chat model. The first-run wizard walks you through this.

First run

The first invocation triggers a one-time setup wizard:

$ uhh how do i list listening ports

It looks like this is the first time uhh has run —
you need to configure a few things.

Ollama host [http://localhost:11434]:
  ✓ reachable, 8 model(s) installed

Recommended for command lookup:
  [1] qwen2.5-coder:14b-instruct-q4_K_M  ✓ installed   best for shell commands
  [2] qwen3:14b                          ✓ installed   newer general-purpose
  [3] llama3.1:8b                        ✓ installed   fast, solid all-rounder
  [4] qwen2.5-coder:7b                   pull (4.4 GB) smaller coder model
  [5] llama3.2:3b                        pull (2.0 GB) tiny + very fast

Pick a default model [1]:

After picking a model (and pulling it if needed), your original question is answered. To re-run setup later, delete the config file and run uhh ... again.

Configuration

Config lives at:

  • Linux / macOS: ~/.config/uhh/config.toml
  • Windows: %APPDATA%\uhh\config.toml

Configure multiple Ollama instances ("profiles") and switch between them:

default_profile = "local"

[profiles.local]
host  = "http://localhost:11434"
model = "qwen2.5-coder:14b-instruct-q4_K_M"

[profiles.homelab]
host  = "http://homelab.lan:11434"
model = "qwen3:14b"
# api_key = "..."   # optional bearer token if Ollama is behind a proxy

Then:

uhh --profile homelab "rotate this nginx cert"

Override hierarchy (highest wins): --host / --model / --profile flags → UHH_HOST / UHH_MODEL / UHH_PROFILE env vars → config file.

Useful flags

Flag              Purpose
--no-run          Print the command but skip the y/N prompt
-y, --yes         Run without asking (only for same-OS commands)
--show-context    Print the system facts being sent to the model
--no-context      Don't send any system facts
--shell SHELL     Override shell-dialect detection (bash/zsh/fish/powershell)
--list-profiles   List configured profiles
--config          Print the config file path
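For a sense of what --shell overrides, here is a plausible sketch of environment-based dialect detection. uhh's real logic may differ; this just inspects the OS and $SHELL:

```python
# Plausible shell-dialect detection: PowerShell on Windows, otherwise
# infer from the $SHELL environment variable, defaulting to bash.
import os
import platform

def detect_shell() -> str:
    if platform.system() == "Windows":
        return "powershell"
    shell = os.environ.get("SHELL", "")
    for name in ("fish", "zsh", "bash"):
        if name in shell:
            return name
    return "bash"
```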

What gets sent to the model

A small, read-only snapshot of your machine is included in the prompt by default so suggestions use your real values:

  • Hostname, username, OS, current working directory
  • Filenames of public keys in ~/.ssh/ (e.g. id_ed25519.pub) — never key contents
  • Host alias names from ~/.ssh/config — never destination hostnames

Nothing is sent to a third party — only to the Ollama instance you configured. Disable entirely with --no-context.

Development

Install the latest unreleased code from the develop branch:

pip install git+https://github.com/mattintech/uhh.git@develop

Or with pipx:

pipx install git+https://github.com/mattintech/uhh.git@develop

Re-install from a different branch (e.g. a feature branch) — pipx upgrade won't switch refs, so use --force:

pipx install --force git+https://github.com/mattintech/uhh.git@feature/pypiready

For pip, add --force-reinstall:

pip install --force-reinstall git+https://github.com/mattintech/uhh.git@feature/pypiready

Check the installed version anytime with uhh --version.

Branches:

  • main — tracks released versions; tag a release here to publish to PyPI
  • develop — integration branch for in-progress work; install from here to try unreleased changes

Releasing: no version bump is needed — versions come from git tags via hatch-vcs. Merge develop → main, then in the GitHub UI: Releases → Draft → tag vX.Y.Z → Publish. The publish.yml workflow ships it to PyPI.

Requirements

  • Python 3.11+ (uses stdlib tomllib)
  • Ollama reachable locally or remotely
  • A chat-tuned model (the wizard helps install one)

License

MIT

