# uhh
i forgor 💀 — ask your local LLM for the command you forgot, then run it.
uhh is a tiny CLI that asks a local Ollama instance for the command-line answer to a natural-language question, prints it, and optionally runs it. Cross-platform, zero runtime dependencies.
## Demo

```console
$ uhh how do i scp test.txt to mhills@172.16.0.1

scp test.txt mhills@172.16.0.1:~/
# Copies test.txt to the remote host's home directory.

Run it? [y/N] y
test.txt                          100%   12B ...
```
It reads safe local context — your username, hostname, current directory, names of ~/.ssh/*.pub keys, ssh config Host aliases — so the suggested command uses your real values instead of placeholders.
If you ask about a different OS than your current one (e.g. you're on Linux but ask "how do I unlock the keychain on mac"), uhh produces the right command but skips the run prompt for safety.
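The README doesn't show how this cross-OS guard is implemented; here is a minimal sketch of the idea, with purely illustrative keyword matching (none of these names are uhh's actual API):

```python
import sys

# Hypothetical sketch of the guard described above: if the question names
# an OS other than the one we're running on, the command can still be
# suggested, but the "Run it?" prompt should be skipped.
OS_KEYWORDS = {
    "linux": "linux",
    "mac": "darwin", "macos": "darwin", "osx": "darwin",
    "windows": "win32", "powershell": "win32",
}

def safe_to_offer_run(question: str, platform: str = sys.platform) -> bool:
    """Return False when the question targets a different OS than `platform`."""
    for word in question.lower().split():
        target = OS_KEYWORDS.get(word.strip("?.,"))
        if target is not None and not platform.startswith(target):
            return False
    return True

print(safe_to_offer_run("how do I unlock the keychain on mac", platform="linux"))  # False
print(safe_to_offer_run("how do i list listening ports", platform="linux"))        # True
```

A real implementation would likely let the model itself flag the target OS, but the effect is the same: a mismatch downgrades the answer to print-only.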
## Install

```shell
pip install uhh
```

Or with pipx for an isolated install:

```shell
pipx install uhh
```
You also need Ollama running somewhere reachable, plus at least one chat model. The first-run wizard walks you through this.
## First run

Any first invocation triggers a one-time setup wizard:

```console
$ uhh how do i list listening ports

It looks like this is the first time uhh has run —
you need to configure a few things.

Ollama host [http://localhost:11434]:
✓ reachable, 8 model(s) installed

Recommended for command lookup:
  [1] qwen2.5-coder:14b-instruct-q4_K_M   ✓ installed     best for shell commands
  [2] qwen3:14b                           ✓ installed     newer general-purpose
  [3] llama3.1:8b                         ✓ installed     fast, solid all-rounder
  [4] qwen2.5-coder:7b                    pull (4.4 GB)   smaller coder model
  [5] llama3.2:3b                         pull (2.0 GB)   tiny + very fast

Pick a default model [1]:
```
After picking a model (and pulling it if needed), your original question is answered. To re-run setup later, delete the config file and invoke `uhh ...` again.
## Configuration

Config lives at:

- Linux / macOS: `~/.config/uhh/config.toml`
- Windows: `%APPDATA%\uhh\config.toml`
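The per-OS locations above can be resolved in a few lines; this is a sketch of that mapping, not uhh's actual code (a real tool might also honor `XDG_CONFIG_HOME`):

```python
import os
import sys
from pathlib import Path

def config_path() -> Path:
    """Resolve the config location described above (illustrative sketch)."""
    if sys.platform == "win32":
        # %APPDATA%\uhh\config.toml
        return Path(os.environ["APPDATA"]) / "uhh" / "config.toml"
    # ~/.config/uhh/config.toml on Linux / macOS
    return Path.home() / ".config" / "uhh" / "config.toml"

print(config_path())
```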
Configure multiple Ollama instances ("profiles") and switch between them:

```toml
default_profile = "local"

[profiles.local]
host = "http://localhost:11434"
model = "qwen2.5-coder:14b-instruct-q4_K_M"

[profiles.homelab]
host = "http://homelab.lan:11434"
model = "qwen3:14b"
# api_key = "..."  # optional bearer token if Ollama is behind a proxy
```
Then:

```shell
uhh --profile homelab "rotate this nginx cert"
```
Override hierarchy (highest wins): `--host` / `--model` / `--profile` flags → `UHH_HOST` / `UHH_MODEL` / `UHH_PROFILE` env vars → config file.
## Useful flags

| Flag | Purpose |
|---|---|
| `--no-run` | Print the command but skip the y/N prompt |
| `-y`, `--yes` | Run without asking (only for same-OS commands) |
| `--show-context` | Print the system facts being sent to the model |
| `--no-context` | Don't send any system facts |
| `--shell SHELL` | Override shell-dialect detection (bash/zsh/fish/powershell) |
| `--list-profiles` | List configured profiles |
| `--config` | Print the config file path |
## What gets sent to the model

A small, read-only snapshot of your machine is included in the prompt by default so suggestions use your real values:

- Hostname, username, OS, current working directory
- Filenames of public keys in `~/.ssh/` (e.g. `id_ed25519.pub`) — never key contents
- Host alias names from `~/.ssh/config` — never destination hostnames

Nothing is sent to a third party — only to the Ollama instance you configured. Disable entirely with `--no-context`.
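A sketch of what gathering that snapshot could look like — identity facts, public-key *filenames* only, and ssh `Host` *aliases* only (illustrative, not uhh's actual implementation):

```python
import getpass
import platform
import re
from pathlib import Path

def gather_context() -> dict:
    """Collect the read-only facts described above."""
    ssh_dir = Path.home() / ".ssh"
    pub_keys = sorted(p.name for p in ssh_dir.glob("*.pub")) if ssh_dir.is_dir() else []
    aliases: list[str] = []
    ssh_config = ssh_dir / "config"
    if ssh_config.is_file():
        for line in ssh_config.read_text().splitlines():
            m = re.match(r"\s*Host\s+(.+)", line)  # alias lines, not HostName values
            if m:
                aliases.extend(a for a in m.group(1).split() if "*" not in a)
    return {
        "user": getpass.getuser(),
        "hostname": platform.node(),
        "os": platform.system(),
        "cwd": str(Path.cwd()),
        "ssh_public_keys": pub_keys,   # filenames only, never key contents
        "ssh_host_aliases": aliases,   # alias names only, never destinations
    }

print(gather_context()["os"])
```

`uhh --show-context` prints the real snapshot, which is the authoritative way to see exactly what leaves your machine.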
## Development

Install the latest unreleased code from the develop branch:

```shell
pip install git+https://github.com/mattintech/uhh.git@develop
```

Or with pipx:

```shell
pipx install git+https://github.com/mattintech/uhh.git@develop
```

Re-install from a different branch (e.g. a feature branch) — `pipx upgrade` won't switch refs, so use `--force`:

```shell
pipx install --force git+https://github.com/mattintech/uhh.git@feature/pypiready
```

For pip, add `--force-reinstall`:

```shell
pip install --force-reinstall git+https://github.com/mattintech/uhh.git@feature/pypiready
```

Check the installed version anytime with `uhh --version`.

Branches:

- `main` — tracks released versions; tag a release here to publish to PyPI
- `develop` — integration branch for in-progress work; install from here to try unreleased changes

Releasing: bump nothing — versions come from git tags via hatch-vcs. Merge develop → main, then GitHub UI → Releases → Draft → tag vX.Y.Z → Publish. The publish.yml workflow ships it to PyPI.
## Requirements

- Python 3.11+ (uses stdlib `tomllib`)
- Ollama reachable locally or remotely
- A chat-tuned model (the wizard helps install one)
## License