Web UI to install and manage AMD ROCm + local AI tools on WSL2 (with optional CLI/TUI)

🔥 ROCm-WSL-AI Web UI

Make your AMD GPU sing inside WSL2. Use one Python CLI (rocmwsl, alias rocm-wsl-ai) and an optional Textual TUI to install, launch, update, and remove local AI tools — always ready for the latest PyTorch Nightly.

What you get

  • Always latest ROCm (from AMD’s “latest” apt repo) + PyTorch Nightly matched to your installed ROCm series
  • A modern, keyboard-driven TUI (Textual) with clear categories
  • One place to install, start, update, and remove local AI tools (image gen + LLMs)
  • Optional no-chmod Python CLI: install and run everything with a single command

Tools included (by category)

Image generation

  • ComfyUI
  • SD.Next
  • Automatic1111 WebUI
  • InvokeAI
  • Fooocus
  • SD WebUI Forge

LLMs

  • Ollama (with a small model manager script)
  • Text Generation WebUI

ROCm‑WSL‑AI Web UI

A modern, lightweight web interface to install, run, and manage local AI tools on WSL2 with AMD ROCm. It wraps the existing project features into a single browser-based control panel with live logs, jobs, and per‑tool settings.

Highlights

  • One web dashboard for popular tools (ComfyUI, A1111/SD.Next/Forge, Fooocus, InvokeAI, SillyTavern, TextGen, llama.cpp, KoboldCpp, FastChat, Ollama)
  • Start/stop, status, and interface links per tool
  • Live logs via SSE or WebSocket with filter and colorized streams
  • Job history with progress for installers and long‑running tasks
  • Models: location overview, index, refresh, link, and curated preset downloads
  • Wizard to set up base folders/venv and defaults for tool flags
  • Per‑tool settings persist (URL, extra args), plus smart Host/Port helpers
  • Clean, responsive UI (PicoCSS), with theme toggle and small toasts/dialogs

Quick start

  1. Install inside WSL2 (recommended)

Open your WSL distro (e.g., Ubuntu) and run:

python3 -m venv ~/.venvs/rocmwsl
source ~/.venvs/rocmwsl/bin/activate
python -m pip install --upgrade pip
pip install rocm-wsl-ai

If pip cannot find the package (not published on PyPI yet), install from source (this repo) or from GitHub:

From source (recommended if you already cloned this repo on Windows):

cd /mnt/f/Coding/rocm-wsl-ai   # adjust path to your repo inside WSL
python3 -m venv ~/.venvs/rocmwsl
source ~/.venvs/rocmwsl/bin/activate
python -m pip install --upgrade pip
pip install -e .

From Windows PowerShell into WSL using your local checkout:

wsl -e bash -lc "cd /mnt/f/Coding/rocm-wsl-ai && python3 -m venv ~/.venvs/rocmwsl && source ~/.venvs/rocmwsl/bin/activate && python -m pip install --upgrade pip && pip install -e ."

Or directly from GitHub (if the repo is public):

pip install "git+https://github.com/daMustermann/rocm-wsl-ai.git@main"

  2. Run the Web UI inside WSL:
export ROCMWSL_WEB_TOKEN="set-a-strong-token"   # optional but recommended on LAN
rocmwsl-web                                     # serves on 0.0.0.0:8000

From Windows, open http://localhost:8000 in your browser (WSL2 forwards localhost).

You can also launch it from PowerShell directly into WSL:

wsl -e bash -lc "export ROCMWSL_WEB_TOKEN='set-a-strong-token'; rocmwsl-web"

Tip: to change the port, call run() with a different port. Example from within WSL:

python -c "from rocm_wsl_ai.web.app import run; run(host='0.0.0.0', port=9000)"

Using the Web UI

Dashboard

  • Cards show each tool’s status (running/stopped), PID, and actions.
  • Click Install to run the installer as a background job.
  • Start launches the tool (background when supported). Stop ends it.
  • Logs opens a live log stream (switch between SSE/WS). Use the filter box for regex filtering; stderr/stdout are color‑coded.
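
The filter box's regex matching can be pictured as a small helper like the following. This is a hypothetical sketch, not the project's actual code; the function name and line format are assumptions:

```python
import re

def filter_log_lines(lines, pattern=None):
    """Return only the lines matching an optional regex (case-insensitive).

    Illustrates the idea behind the log view's filter box.
    """
    if not pattern:
        return list(lines)
    rx = re.compile(pattern, re.IGNORECASE)
    return [line for line in lines if rx.search(line)]

stream = [
    "[stdout] loading checkpoint",
    "[stderr] Error: port 7860 already in use",
    "[stdout] server started",
]
print(filter_log_lines(stream, r"error"))  # keeps only the stderr error line
```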

Tools page

  • Per‑tool settings:
    • Interface URL (used for the “Open interface” link in cards)
    • Host & Port helpers that auto‑compose common flags (e.g., --listen/--port)
    • Extra Args to pass on start (stored and reused)
  • The UI keeps Host/Port and URL in sync for convenience.
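
The Host/Port helpers can be thought of as composing a flag string from the saved settings. A minimal sketch; flag names like --listen/--port are typical of SD web UIs but vary per tool, so treat them as illustrative assumptions:

```python
def compose_start_args(host=None, port=None, extra_args=""):
    """Build a launch-flag string from per-tool Host/Port settings plus Extra Args.

    Illustrative only: real tools differ (e.g. --listen vs --host).
    """
    parts = []
    if host:
        parts.append(f"--listen {host}")
    if port:
        parts.append(f"--port {port}")
    if extra_args:
        parts.append(extra_args)
    return " ".join(parts)

print(compose_start_args("0.0.0.0", 8188, "--highvram"))
```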

Models page

  • See where models are located for different categories.
  • Build and refresh a searchable models index.
  • Link your models into supported tools folders.
  • Download preset model bundles (curated). Tasks run as jobs with progress.
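
Building a searchable index amounts to walking the models folder and collecting known file types. A minimal sketch, assuming a flat list of paths keyed by extension (the project's real index format may differ):

```python
import os

# Common model file extensions; an assumption for illustration
MODEL_EXTS = {".safetensors", ".ckpt", ".gguf", ".pt", ".bin"}

def build_model_index(root):
    """Recursively collect model files under `root`, matched by extension."""
    index = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if os.path.splitext(name)[1].lower() in MODEL_EXTS:
                index.append(os.path.join(dirpath, name))
    return sorted(index)
```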

Wizard

  • Configure base directory, venv, and optional defaults for tool flags (host/port/flags).
  • Saves defaults into a tools.json so starts can reuse them.
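
Conceptually, the saved defaults boil down to a small JSON mapping per tool. A sketch of reading and merging such a file; the host/port/flags schema here is an assumption, not the actual tools.json layout:

```python
import json
from pathlib import Path

def save_tool_defaults(path, tool, host, port, flags=""):
    """Merge one tool's start defaults into a tools.json-style file and return the data."""
    p = Path(path)
    data = json.loads(p.read_text()) if p.exists() else {}
    data[tool] = {"host": host, "port": port, "flags": flags}
    p.write_text(json.dumps(data, indent=2))
    return data
```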

Help

  • Quick tips and troubleshooting pointers integrated into the UI.

Updates

To update the package:

pip install -U rocm-wsl-ai

Your settings (tools.json), job history (jobs.json), and config live in the project’s config directory and are preserved across updates.


Security

  • Optional token-based auth: set ROCMWSL_WEB_TOKEN before you start the server (inside WSL):
export ROCMWSL_WEB_TOKEN="your-long-random-token"
rocmwsl-web
  • With a token set, the UI redirects to a small login where you paste the token. APIs also accept the token via cookie, x-auth header, or token query parameter.
  • If you expose the server on your LAN (host 0.0.0.0), use a token. For public networks, prefer a proper reverse proxy and TLS.
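
The three accepted token locations can be checked in order, as in this sketch. The x-auth header comes from the list above; the cookie and query-parameter names are assumptions, and the comparison uses a constant-time check:

```python
import hmac

def is_authorized(expected, headers=None, cookies=None, query=None):
    """Accept a token via x-auth header, cookie, or ?token= query parameter.

    Illustrative sketch; cookie/parameter names are assumed.
    """
    supplied = (
        (headers or {}).get("x-auth")
        or (cookies or {}).get("token")
        or (query or {}).get("token")
    )
    # hmac.compare_digest avoids leaking token contents via timing differences
    return bool(supplied) and hmac.compare_digest(supplied, expected)

print(is_authorized("s3cret", headers={"x-auth": "s3cret"}))  # True
```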

FAQs / Troubleshooting

  • The tool doesn’t start or shows “stopped” quickly.
    • Open Logs to see errors in real‑time. Check Extra Args on Tools page. Verify the tool repository and dependencies are installed.
  • Interface link opens the wrong port.
    • Edit the URL in the tool’s settings. Host/Port helpers can auto‑compose flags; the UI syncs URL and Host/Port.
  • SillyTavern install requires Node.
    • The installer attempts to guide via nvm. If Node isn’t present, the job runs nvm install/use LTS and npm install in the SillyTavern folder.
  • Where are my settings stored?
    • tools.json and jobs.json are saved next to your main config.toml (see Models page → Where for base folder hints).

Uninstall

pip uninstall rocm-wsl-ai

License

MIT


Releasing (maintainers)

PyPI:

python -m pip install --upgrade build twine
python -m build
twine check dist/*
twine upload dist/*

GitHub:

  • Tag the release (e.g., v0.2.0) and push tags
  • Create a GitHub Release with notes and attach wheels/sdist if desired

After PyPI release, you can simplify README install instructions to a single line:

pip install rocm-wsl-ai

The TUI (optional)

rocmwsl menu   # launches the Textual TUI

Use arrow keys/Enter. Install “base” first, then pick your tools. Launch and update from the TUI or via CLI.

Typical first run

  1. Wizard: quick first-time setup (Base + optional ComfyUI)
rocmwsl wizard

or manually:

  1. Installation → Base (ROCm & PyTorch Nightly)
  2. Restart WSL if asked
  3. Installation → Pick your tools (e.g., ComfyUI, A1111, Ollama)
  4. Launch → Start your tools

CLI equivalent

rocmwsl wizard --base-dir "$HOME/AI" --venv-name genai_env
# or manually
rocmwsl install base
rocmwsl install comfyui
rocmwsl start comfyui

Upgrading

  • Update everything: rocmwsl update all
  • Update a single tool (e.g., ComfyUI): rocmwsl update comfyui
  • Update base (PyTorch Nightly): rocmwsl update base
  • Self-update (CLI/TUI):
    • If installed via pipx: pipx upgrade rocm-wsl-ai or pipx upgrade rocmwsl
    • From CLI: rocmwsl update self

Useful tips

  • Diagnostics: rocmwsl doctor (checks /dev/kfd, rocm-smi/rocminfo, and Torch/HIP inside the venv)
  • Configuration: ~/.config/rocm-wsl-ai/config.toml
     [paths]
     base_dir = "/home/<user>/AI"

     [python]
     venv_name = "genai_env"
  • If the TUI looks very plain, install whiptail (see Requirements)
  • If you changed groups during base install: restart WSL (wsl --shutdown from Windows)
  • Ollama’s systemd user service may require systemd in WSL; if it doesn’t start, run it manually via the scripts
  • For ROCm trouble, use the menu’s Driver Management and follow the prompts
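
The doctor checks mentioned above can be pictured as a few filesystem and PATH probes. A hypothetical sketch, not the actual rocmwsl doctor implementation:

```python
import shutil
from pathlib import Path

def doctor(kfd_path="/dev/kfd", tools=("rocm-smi", "rocminfo")):
    """Report whether the ROCm device node exists and ROCm CLI tools are on PATH."""
    report = {"kfd": Path(kfd_path).exists()}
    for tool in tools:
        report[tool] = shutil.which(tool) is not None
    return report

print(doctor())
```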
