Web UI to install and manage AMD ROCm + local AI tools on WSL2 (with optional CLI/TUI)

🔥 ROCm-WSL-AI Web UI

Make your AMD GPU sing inside WSL2. Use one Python CLI (rocmwsl, alias rocm-wsl-ai) and an optional Textual TUI to install, launch, update, and remove local AI tools — always ready for the latest PyTorch Nightly.

What you get

  • Always latest ROCm (from AMD’s “latest” apt repo) + PyTorch Nightly matched to your installed ROCm series
  • A modern, keyboard-driven TUI (Textual) with clear categories
  • One place to install, start, update, and remove local AI tools (image gen + LLMs)
  • Optional no-chmod Python CLI: install and run everything with a single command

Tools included (by category)

Image generation

  • ComfyUI
  • SD.Next
  • Automatic1111 WebUI
  • InvokeAI
  • Fooocus
  • SD WebUI Forge

LLMs

  • Ollama (with a small model manager script)
  • Text Generation WebUI

ROCm‑WSL‑AI Web UI

A modern, lightweight web interface to install, run, and manage local AI tools on WSL2 with AMD ROCm. It wraps the existing project features into a single browser-based control panel with live logs, jobs, and per‑tool settings.

Highlights

  • One web dashboard for popular tools (ComfyUI, A1111/SD.Next/Forge, Fooocus, InvokeAI, SillyTavern, TextGen, llama.cpp, KoboldCpp, FastChat, Ollama)
  • Start/stop, status, and interface links per tool
  • Live logs via SSE or WebSocket with filter and colorized streams
  • Job history with progress for installers and long‑running tasks
  • Models: location overview, index, refresh, link, and curated preset downloads
  • Wizard to set up base folders/venv and defaults for tool flags
  • Per‑tool settings persist (URL, extra args), plus smart Host/Port helpers
  • Clean, responsive UI (PicoCSS), with theme toggle and small toasts/dialogs

Quick start

  1. Install inside WSL2 (recommended)

PyPI (preferred):

python3 -m venv ~/.venvs/rocmwsl
source ~/.venvs/rocmwsl/bin/activate
python -m pip install --upgrade pip
pip install rocm-wsl-ai

PyPI package: https://pypi.org/project/rocm-wsl-ai/
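After installing, a quick sanity check from inside the activated venv confirms the package and its `rocmwsl` entry point are in place:

```shell
# Sanity-check the install inside the activated venv
pip show rocm-wsl-ai | head -n 2                       # prints Name/Version if installed
command -v rocmwsl || echo "rocmwsl not on PATH - is the venv activated?"
```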

From source (alternative, if you cloned this repo on Windows):

cd /mnt/f/Coding/rocm-wsl-ai   # adjust path to your repo inside WSL
python3 -m venv ~/.venvs/rocmwsl
source ~/.venvs/rocmwsl/bin/activate
python -m pip install --upgrade pip
pip install -e .

From Windows PowerShell into WSL using your local checkout:

wsl -e bash -lc "cd /mnt/f/Coding/rocm-wsl-ai && python3 -m venv ~/.venvs/rocmwsl && source ~/.venvs/rocmwsl/bin/activate && python -m pip install --upgrade pip && pip install -e ."

  2. Run the Web UI inside WSL:
export ROCMWSL_WEB_TOKEN="set-a-strong-token"   # optional but recommended on LAN
rocmwsl-web                                     # serves on 0.0.0.0:8000

From Windows, open http://localhost:8000 in your browser (WSL2 forwards localhost).

You can also launch it from PowerShell directly into WSL:

wsl -e bash -lc "export ROCMWSL_WEB_TOKEN='set-a-strong-token'; rocmwsl-web"

Tip: Change the port if needed by calling run(host='0.0.0.0', port=XXXX). Example from within WSL:

python -c "from rocm_wsl_ai.web.app import run; run(host='0.0.0.0', port=9000)"

Using the Web UI

Dashboard

  • Cards show each tool’s status (running/stopped), PID, and actions.
  • Click Install to run the installer as a background job.
  • Start launches the tool (background when supported). Stop ends it.
  • Logs opens a live log stream (switch between SSE/WS). Use the filter box for regex filtering; stderr/stdout are color‑coded.

Tools page

  • Per‑tool settings:
    • Interface URL (used for the “Open interface” link in cards)
    • Host & Port helpers that auto‑compose common flags (e.g., --listen/--port)
    • Extra Args to pass on start (stored and reused)
  • The UI keeps Host/Port and URL in sync for convenience.

Models page

  • See where models are located for different categories.
  • Build and refresh a searchable models index.
  • Link your models into supported tools folders.
  • Download preset model bundles (curated). Tasks run as jobs with progress.

Wizard

  • Configure base directory, venv, and optional defaults for tool flags (host/port/flags).
  • Saves defaults into a tools.json so starts can reuse them.

Help

  • Quick tips and troubleshooting pointers integrated into the UI.

Updates

To update the package:

pip install -U rocm-wsl-ai

Your settings (tools.json), job history (jobs.json), and config live in the project’s config directory and are preserved across updates.
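Since those files live in the config directory (~/.config/rocm-wsl-ai, per the tips later in this page), a quick backup before a major upgrade is cheap insurance. A minimal sketch, assuming you haven't relocated your config:

```shell
# Back up the config directory (config.toml, tools.json, jobs.json) before upgrading
CFG_DIR="$HOME/.config/rocm-wsl-ai"
BACKUP="$HOME/rocm-wsl-ai-config-$(date +%Y%m%d).tar.gz"
if [ -d "$CFG_DIR" ]; then
  tar czf "$BACKUP" -C "$HOME/.config" rocm-wsl-ai
  echo "Saved $BACKUP"
else
  echo "No config directory at $CFG_DIR yet"
fi
```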


Security

  • Optional token-based auth: set an environment variable before you start the server (from bash inside WSL):
export ROCMWSL_WEB_TOKEN="your-long-random-token"
rocmwsl-web
  • With a token set, the UI redirects to a small login where you paste the token. APIs also accept the token via cookie, x-auth header, or token query parameter.
  • If you expose the server on your LAN (host 0.0.0.0), use a token. For public networks, prefer a proper reverse proxy and TLS.
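"Long random token" can be made concrete; one way to generate one (openssl shown here, but any CSPRNG works):

```shell
# Generate a 64-character hex token and export it for this shell session
export ROCMWSL_WEB_TOKEN="$(openssl rand -hex 32)"
echo "Token length: ${#ROCMWSL_WEB_TOKEN}"
```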

FAQs / Troubleshooting

  • The tool doesn’t start or shows “stopped” quickly.
    • Open Logs to see errors in real‑time. Check Extra Args on Tools page. Verify the tool repository and dependencies are installed.
  • Interface link opens the wrong port.
    • Edit the URL in the tool’s settings. Host/Port helpers can auto‑compose flags; the UI syncs URL and Host/Port.
  • SillyTavern install requires Node.
    • The installer guides you through setup via nvm. If Node isn’t present, the job runs nvm install/use for the LTS release and npm install in the SillyTavern folder.
  • Where are my settings stored?
    • tools.json and jobs.json are saved next to your main config.toml (see Models page → Where for base folder hints).

Uninstall

pip uninstall rocm-wsl-ai

License

MIT


Releasing (maintainers)

PyPI:

python -m pip install --upgrade build twine
python -m build
twine check dist/*
twine upload dist/*

GitHub:

  • Tag the release (e.g., v0.2.0) and push tags
  • Create a GitHub Release with notes and attach wheels/sdist if desired
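The tagging step above might look like this (the version string is an example; run from the repo root):

```shell
# Create an annotated tag for the release and push it
VERSION="v0.2.0"
git tag -a "$VERSION" -m "Release $VERSION"
git push origin "$VERSION"
```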

After PyPI release, you can simplify README install instructions to a single line:

pip install rocm-wsl-ai

The TUI (optional)

rocmwsl menu   # launches the Textual TUI

Use arrow keys/Enter. Install “base” first, then pick your tools. Launch and update from the TUI or via CLI.

Typical first run

  1. Wizard: quick first-time setup (Base + optional ComfyUI)
rocmwsl wizard

or manually:

  1. Installation → Base (ROCm & PyTorch Nightly)
  2. Restart WSL if asked
  3. Installation → Pick your tools (e.g., ComfyUI, A1111, Ollama)
  4. Launch → Start your tools

CLI equivalent

rocmwsl wizard --base-dir "$HOME/AI" --venv-name genai_env
# or manually
rocmwsl install base
rocmwsl install comfyui
rocmwsl start comfyui

Upgrading

  • Update everything: rocmwsl update all
  • Update a single tool (e.g., ComfyUI): rocmwsl update comfyui
  • Update base (PyTorch Nightly): rocmwsl update base
  • Self-update (CLI/TUI):
    • If installed via pipx: pipx upgrade rocm-wsl-ai or pipx upgrade rocmwsl
    • From CLI: rocmwsl update self

Useful tips

  • Diagnose: rocmwsl doctor (checks /dev/kfd, rocm-smi/rocminfo, and Torch/HIP in the venv)
  • Configuration: ~/.config/rocm-wsl-ai/config.toml
     [paths]
     base_dir = "/home/<user>/AI"
    
     [python]
     venv_name = "genai_env"
    
  • If the TUI looks very plain, install whiptail (see Requirements)
  • If you changed groups during base install: restart WSL (wsl --shutdown from Windows)
  • Ollama’s systemd user service may require systemd in WSL; if it doesn’t start, run it manually via the scripts
  • For ROCm trouble, use the menu’s Driver Management and follow the prompts
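If rocmwsl doctor flags problems, the same checks can be done by hand. A minimal sketch, assuming the device paths WSL2/ROCm typically exposes:

```shell
# Manually verify the GPU devices and ROCm tooling that `rocmwsl doctor` checks
ls -l /dev/kfd /dev/dri 2>/dev/null || echo "/dev/kfd missing - restart WSL (wsl --shutdown) and retry"
command -v rocm-smi >/dev/null 2>&1 && rocm-smi || echo "rocm-smi not found - install base first"
```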



