
uhh


i forgor 💀 — ask your local LLM for the command you forgot, then run it.

uhh is a tiny CLI that asks a local Ollama instance for the command-line answer to a natural-language question, prints it, and optionally runs it. Cross-platform, zero runtime dependencies.

Demo

$ uhh how do i scp test.txt to mhills@172.16.0.1
$ scp test.txt mhills@172.16.0.1:~/
  # Copies test.txt to the remote host's home directory.

Run it? [y/N] y
test.txt    100%   12B  ...

It reads safe local context — your username, hostname, current directory, names of ~/.ssh/*.pub keys, ssh config Host aliases — so the suggested command uses your real values instead of placeholders.

If you ask about a different OS than your current one (e.g. you're on Linux but ask "how do I unlock the keychain on mac"), uhh produces the right command but skips the run prompt for safety.

Quoting your question

uhh reads your question from argv, so your shell parses it before uhh ever sees it. Two classes of characters will trip you up if left bare:

  • Apostrophes (') — uhh how do I keep my mac from sleeping when I don't want it to will leave your shell sitting on a continuation prompt waiting for the closing '. Wrap the question in double quotes, escape the apostrophe, or just rewrite without the contraction.
  • Shell metacharacters (|, >, <, &, ;, $, backticks) — these get interpreted by the shell. Quote the question if you're including any of them.
# bad — shell hangs on the apostrophe:
uhh how do I stop my mac sleeping when I don't want it to

# good — any of these works:
uhh "how do I stop my mac sleeping when I don't want it to"
uhh how do I stop my mac sleeping when I don\'t want it to
uhh how do I stop my mac sleeping when I do not want it to

Install

pip install uhh

Or with pipx for an isolated install:

pipx install uhh

You also need Ollama running somewhere reachable, plus at least one chat model. The first-run wizard walks you through this.

First run

The first invocation triggers a one-time setup wizard:

$ uhh how do i list listening ports

It looks like this is the first time uhh has run —
you need to configure a few things.

Ollama host [http://localhost:11434]:
  ✓ reachable, 8 model(s) installed

Recommended for command lookup:
  [1] qwen2.5-coder:14b-instruct-q4_K_M  ✓ installed   best for shell commands
  [2] qwen3:14b                          ✓ installed   newer general-purpose
  [3] llama3.1:8b                        ✓ installed   fast, solid all-rounder
  [4] qwen2.5-coder:7b                   pull (4.4 GB) smaller coder model
  [5] llama3.2:3b                        pull (2.0 GB) tiny + very fast

Pick a default model [1]:

After picking a model (and pulling it if needed), your original question is answered. To re-run setup later, delete the config file and run uhh ... again.
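The wizard's "reachable, 8 model(s) installed" line corresponds to a query against Ollama's /api/tags endpoint, which lists the models installed on the host. A minimal sketch of that check (the function names here are illustrative, not uhh's actual internals):

```python
import json
import urllib.request

def extract_model_names(payload: dict) -> list[str]:
    # /api/tags responds with {"models": [{"name": "qwen3:14b", ...}, ...]}
    return [m["name"] for m in payload.get("models", [])]

def installed_models(host: str = "http://localhost:11434") -> list[str]:
    # One way to implement the wizard's reachability check:
    # GET <host>/api/tags and read out the installed model names.
    with urllib.request.urlopen(f"{host}/api/tags", timeout=5) as resp:
        return extract_model_names(json.load(resp))
```

If the request fails, the host is unreachable and the wizard can re-prompt for a different URL.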

Configuration

Config lives at:

  • Linux / macOS: ~/.config/uhh/config.toml
  • Windows: %APPDATA%\uhh\config.toml
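A cross-platform lookup for that path can be sketched as follows (illustrative only; uhh's actual resolution logic may differ, e.g. in whether it honors XDG_CONFIG_HOME):

```python
import os
import sys
from pathlib import Path

def config_path() -> Path:
    # Windows keeps per-user config under %APPDATA%; elsewhere use
    # $XDG_CONFIG_HOME if set, falling back to ~/.config.
    if sys.platform == "win32":
        base = Path(os.environ["APPDATA"])
    else:
        base = Path(os.environ.get("XDG_CONFIG_HOME") or Path.home() / ".config")
    return base / "uhh" / "config.toml"
```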

Configure multiple Ollama instances ("profiles") and switch between them:

default_profile = "local"

[profiles.local]
host  = "http://localhost:11434"
model = "qwen2.5-coder:14b-instruct-q4_K_M"

[profiles.homelab]
host  = "http://homelab.lan:11434"
model = "qwen3:14b"
# api_key = "..."   # optional bearer token if Ollama is behind a proxy

Then:

uhh --profile homelab "rotate this nginx cert"

Override hierarchy (highest wins): --host / --model / --profile flags → UHH_HOST / UHH_MODEL / UHH_PROFILE env vars → config file.
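That precedence boils down to a "first non-empty source wins" lookup. A sketch of the idea (resolve is a hypothetical helper, not part of uhh's API):

```python
import os

def resolve(flag_value, env_var, config_value):
    # Highest wins: explicit CLI flag, then environment variable,
    # then whatever the config file says.
    if flag_value is not None:
        return flag_value
    env = os.environ.get(env_var)
    if env:
        return env
    return config_value

# e.g. model = resolve(args.model, "UHH_MODEL", profile.get("model"))
```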

Useful flags

  • --no-run: print the command but skip the y/N prompt
  • -y, --yes: run without asking (only for same-OS commands)
  • --show-context: print the system facts being sent to the model
  • --no-context: don't send any system facts
  • --shell SHELL: override shell-dialect detection (bash/zsh/fish/powershell)
  • --list-profiles: list configured profiles
  • --config: print the config file path

What gets sent to the model

A small, read-only snapshot of your machine is included in the prompt by default so suggestions use your real values:

  • Hostname, username, OS, current working directory
  • Filenames of public keys in ~/.ssh/ (e.g. id_ed25519.pub) — never key contents
  • Host alias names from ~/.ssh/config — never destination hostnames

Nothing is sent to a third party — only to the Ollama instance you configured. Disable entirely with --no-context.
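Collecting that snapshot needs nothing beyond the standard library. A hedged sketch of the idea (gather_context is illustrative, not uhh's actual function):

```python
import getpass
import os
import platform
import re
from pathlib import Path

def gather_context() -> dict:
    ssh_dir = Path.home() / ".ssh"
    # Public-key *filenames* only; the files themselves are never read.
    pub_keys = sorted(p.name for p in ssh_dir.glob("*.pub")) if ssh_dir.is_dir() else []
    # Host alias names only; HostName/User values stay private.
    aliases = []
    ssh_config = ssh_dir / "config"
    if ssh_config.is_file():
        for line in ssh_config.read_text().splitlines():
            m = re.match(r"\s*Host\s+(.+)", line, re.IGNORECASE)
            if m:
                aliases += [a for a in m.group(1).split() if "*" not in a]
    return {
        "user": os.environ.get("USER") or getpass.getuser(),
        "hostname": platform.node(),
        "os": platform.system(),
        "cwd": os.getcwd(),
        "ssh_pub_keys": pub_keys,
        "ssh_host_aliases": aliases,
    }
```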

Development

Install the latest unreleased code from the develop branch:

pip install git+https://github.com/mattintech/uhh.git@develop

Or with pipx:

pipx install git+https://github.com/mattintech/uhh.git@develop

Re-install from a different branch (e.g. a feature branch) — pipx upgrade won't switch refs, so use --force:

pipx install --force git+https://github.com/mattintech/uhh.git@feature/pypiready

For pip, add --force-reinstall:

pip install --force-reinstall git+https://github.com/mattintech/uhh.git@feature/pypiready

Check the installed version anytime with uhh --version.

Branches:

  • main — tracks released versions; tag a release here to publish to PyPI
  • develop — integration branch for in-progress work; install from here to try unreleased changes

Releasing: no manual version bump is needed; versions come from git tags via hatch-vcs. Merge develop into main, then in the GitHub UI go to Releases → Draft a new release → tag vX.Y.Z → Publish. The publish.yml workflow ships it to PyPI.
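The hatch-vcs wiring referred to above typically looks like this in pyproject.toml (an illustrative fragment; the project's actual file may differ):

```toml
[build-system]
requires = ["hatchling", "hatch-vcs"]
build-backend = "hatchling.build"

[project]
name = "uhh"
dynamic = ["version"]   # version is derived from the latest git tag

[tool.hatch.version]
source = "vcs"
```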

Requirements

  • Python 3.11+ (uses stdlib tomllib)
  • Ollama reachable locally or remotely
  • A chat-tuned model (the wizard helps install one)

License

MIT
