
hey-nabu-climate-concierge


A DIY in-car voice concierge that closes Tesla Grok's biggest gap — climate, media, window, and seat control. Single utterance, multi-target. Built on Home Assistant Voice + microWakeWord + an MCP server wrapping tesla-fleet-api.

Install

pip install hey-nabu-climate-concierge              # MCP server only
pip install 'hey-nabu-climate-concierge[live]'      # +tesla-fleet-api for real commands
pip install 'hey-nabu-climate-concierge[dev]'       # +pytest, ruff for contributing

Or with uv:

uv add hey-nabu-climate-concierge
uv add 'hey-nabu-climate-concierge[live]'

The PyPI wheel ships only the MCP server (hey_nabu_mcp). The browser PWA in pwa/ is a separate static-file artifact you host yourself and load in the Tesla browser — see pwa/README.md. The Home Assistant YAML in ha-config/ and the system prompt in prompts/system.md are copy/paste artifacts, not part of the wheel.

After install:

hey-nabu --help              # CLI surface
hey-nabu health              # smoke-test imports + print version
hey-nabu tools               # list registered MCP tools

"Hey Nabu, make it warm and play the focus playlist."

→ microWakeWord on phone fires → HA Assist pipeline → Ollama (Qwen3-32B) sees the intent → calls 3 MCP tools concurrently (set_climate, play_spotify, optional vent_windows) → MCP server dispatches signed Tesla commands via the HTTP Proxy → waits for Fleet API confirmation events before claiming success.
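The "wait for confirmation before claiming success" step at the end of that chain can be sketched in plain Python. This is a hypothetical illustration of the dependency-injected pattern, not the repo's actual API; `send_climate_command` and `wait_for_confirmation` are invented names standing in for the real client/waiter interfaces.

```python
import asyncio

# Hypothetical sketch of a DI'd intent function: dispatch a signed command,
# then wait for the Fleet API confirmation event before reporting success.
async def set_climate(client, waiter, temp_f: float, timeout: float = 10.0) -> dict:
    cmd_id = await client.send_climate_command(temp_f)        # signed cmd via HTTP Proxy
    confirmed = await waiter.wait_for_confirmation(cmd_id, timeout=timeout)
    # Never claim success on dispatch alone; surface "pending" on timeout
    # so the LLM can say "queued" instead of "done".
    return {"ok": confirmed, "pending": not confirmed, "target_f": temp_f}

# Tiny fakes, mirroring the mock-friendly design described below.
class FakeClient:
    async def send_climate_command(self, temp_f):
        return "cmd-1"

class FakeWaiter:
    def __init__(self, confirm): self.confirm = confirm
    async def wait_for_confirmation(self, cmd_id, timeout):
        return self.confirm

result = asyncio.run(set_climate(FakeClient(), FakeWaiter(confirm=True), 72.0))
```

Because both collaborators are injected, the same function runs unchanged against the mock pair or the live client + HA WebSocket waiter.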

The full architecture spec is in ../ideas/v2/wave3/hey_nabu_climate_concierge.md. This repo is the buildable scaffold of that spec.

Why this exists (vs Grok)

Tesla Grok in firmware 2026.14 cannot:

  • Adjust climate, media, windows, frunk, or seats
  • Operate offline (requires Premium Connectivity)
  • Control your smart home
  • See your calendar, email, or RAG over your documents
  • Survive a cellular drop without restarting the session
  • Run multi-step / agentic loops

Hey Nabu does all of that. The wave-3 spec writes up the competitive matrix in detail.

Status — v0.4 alpha

What works today (testable without a Tesla):

  • Pure-Python logic module with one function per voice intent (set_climate, play_spotify, unlock_doors, vent_windows, get_climate_status). Each function takes a TeslaClient and a ConfirmationWaiter as arguments — fully DI'd, mock-friendly.
  • MockTeslaClient records all calls and returns canned responses.
  • FakeConfirmationWaiter simulates the HA event-bus confirmation pattern with controllable timeout/success behavior.
  • FastMCP server.py wraps the logic functions as MCP tools (@mcp.tool()) so Ollama/Claude/any MCP client can call them. Exposes 5 tools.
  • hey-nabu serve Typer CLI starts the MCP server (defaults to SSE transport on 0.0.0.0:8765).
  • HAWebSocketConfirmationWaiter real implementation of the wait-for-tesla_fleet_command_confirmed event listener via the HA WebSocket API.
  • LiveTeslaClient wraps tesla-fleet-api (only loaded when you pip install '...[live]').
  • Full HA YAML configs in ha-config/ ready to paste into your HA install — configuration.yaml, custom_sentences/en/climate.yaml, intent_script.yaml, scripts.yaml, automations.yaml.
  • System prompt at prompts/system.md includes the hallucination-defense clause ("If a tool returns pending=true, say 'queued' not 'done'") that turns the confirmation pattern into a structural property.
  • Demo script at demo/script.md with three timed utterances for filming.
  • Sharp-edges doc at docs/sharp-edges.md with the 10 known footguns from the wave-3 spec.
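The hallucination-defense clause in the system prompt can also be made structural on the response side. A minimal sketch, assuming a tool result shaped like `{"ok": ..., "pending": ...}` (the field names are taken from the prompt's `pending=true` clause; the function itself is illustrative, not part of the package):

```python
# Hypothetical sketch of the rule from prompts/system.md: map a tool result
# to an utterance, never saying "done" unless confirmation actually arrived.
def describe_result(action: str, result: dict) -> str:
    if result.get("pending"):
        return f"{action} is queued, waiting for the car to confirm."
    if result.get("ok"):
        return f"{action} done."
    return f"{action} failed."

print(describe_result("Climate", {"ok": True, "pending": False}))  # prints "Climate done."
```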

Browser side (pwa/, 44/44 Node tests pass — added in v0.2, expanded in v0.4):

  • AgentRouter state machine (cloud → local → error) with injectable transports — pure logic, tested in Node with fakes + a manual clock

  • HAWebSocketTransport wraps the HA conversation/process command via long-lived access token

  • WebLlmTransport lazy-loads @mlc-ai/web-llm (Phi-3.5-mini-instruct, ~2.2 GB to IndexedDB) on first cellular-drop fallback

  • PWA shell: index.html + styles.css + manifest.webmanifest + service-worker.js + app.js. Tesla-touchscreen-friendly dark UI with mode pill + status cards + conversation log + settings dialog

  • Service worker caches the shell offline-first; never caches cross-origin requests

  • Reconnect loop: when in local mode, pings HA every 10 s and swaps back to cloud automatically

  • v0.4: Conversation history persistence. Every user + assistant turn (including scripted demo-mode turns, flagged with "demo": true) is appended to a local IndexedDB store (hey-nabu-history, schema v1, single object store turns keyed by autoincrement id, indexed on timestamp). The header gains a History button that opens a dialog with two tabs:

    • View — the 200 most-recent turns, newest-first.
    • Export — Download as JSONL (MIME type application/x-ndjson, filename hey-nabu-history-YYYY-MM-DD.jsonl) and Clear history (with confirm).

    One JSONL line looks like:

    {"role":"user","text":"set the cabin to 72","timestamp":1763510400000,"id":1,"demo":true}
    

    All store functions in pwa/history-store.js accept an injectable IDB factory so the unit tests run against a tiny in-memory shim — no fake-indexeddb dependency.

  • See pwa/README.md for dev workflow and Tesla-deployment instructions
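Because the export is plain JSONL, any tooling can consume it. A minimal Python reader (field names taken from the sample line above; `load_history` is an illustrative helper, not shipped code):

```python
import json

def load_history(lines):
    """Parse exported hey-nabu history JSONL. Skips blank lines; each
    record carries role/text/timestamp/id and an optional "demo" flag
    marking scripted demo-mode turns."""
    return [json.loads(line) for line in lines if line.strip()]

sample = '{"role":"user","text":"set the cabin to 72","timestamp":1763510400000,"id":1,"demo":true}'
turns = load_history([sample])
```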

What's not yet in v0.4 (deferred):

  • The Hetzner $5/mo Docker Compose HA mirror (an operational pattern documented in docs/sharp-edges.md, not yet shipped as code)
  • The cabin-radar passenger inference (HA already exposes the data; it needs a separate intent)
  • Real HA event-bus state subscription for the PWA status cards (currently polled on a best-effort basis)

Quick start (without a Tesla, just the dev loop)

git clone https://github.com/Raymondriter/hey-nabu-climate-concierge.git
cd hey-nabu-climate-concierge
uv sync --extra dev
uv run pytest                           # ~15 tests, no creds, no Tesla
uv run hey-nabu --help                  # CLI surface

Quick start (full deployment)

  1. Install HA Voice + Ollama per docs/architecture.md. HA OS or HA Container; ollama pull qwen3:32b-q5_k_m on a home box.
  2. Wire the YAML — paste the snippets from ha-config/ into your HA /config/. Restart HA.
  3. Set up Android Companion app with microWakeWord and "Hey Nabu" wake phrase per ha-config/companion_app_setup.md.
  4. Set the env vars in .env (Tesla developer client_id/secret, your VIN, Tesla refresh token, Spotify client_id/secret, HA long-lived token).
  5. Start the MCP server:
    uv sync --extra live --extra dev
    uv run hey-nabu serve --host 0.0.0.0 --port 8765
    
  6. Wire HA's MCP client (Settings → Devices & services → MCP) to http://localhost:8765/sse.
  7. Test: in your car (or anywhere with the Companion app paired to its Bluetooth speaker), say "Hey Nabu, make it warm and play the focus playlist."
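Step 4's environment variables can be sanity-checked before starting the server. A hedged sketch — the variable names below are assumptions for illustration; use whatever names the repo's .env template actually defines:

```python
import os

# Illustrative preflight check: report which expected variables are unset.
# These names are guesses, not the package's documented configuration.
EXPECTED = [
    "TESLA_CLIENT_ID", "TESLA_CLIENT_SECRET", "TESLA_VIN",
    "TESLA_REFRESH_TOKEN", "SPOTIFY_CLIENT_ID", "SPOTIFY_CLIENT_SECRET",
    "HA_TOKEN",
]

def missing_vars(env=os.environ, expected=EXPECTED):
    return [name for name in expected if not env.get(name)]

# With an injected mapping this is testable without touching the real
# environment — the same DI style the rest of the project uses.
missing = missing_vars(env={"TESLA_VIN": "dummy"})
```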

Architecture

┌─────────────────────────┐         ┌──────────────────────────┐
│  Tesla browser PWA      │         │  Home Assistant          │
│  (visual surface only)  │         │   ├ Wyoming + Moonshine  │
└──────────┬──────────────┘         │   ├ Wyoming + Kokoro     │
           │                        │   ├ Ollama (Qwen3-32B)   │
           │ status / cards         │   ├ MCP client → us      │
           ▼                        │   ├ tesla_fleet integ.   │
       Touchscreen                  │   └ event bus            │
                                    └────────┬─────────────────┘
                                             │
   ┌────────────────────────┐                │
   │ Phone (Android         │                │
   │ Companion app)         │                │
   │  ├ microWakeWord       │  Wyoming TCP   │
   │  ├ Bluetooth A2DP →    │ ──────────────▶│
   │  │   car speakers      │                │
   │  └ mic capture         │                │
   └────────────────────────┘                │
                                             │
                                  MCP/SSE    ▼
                              ┌─────────────────────────┐
                              │  hey-nabu-mcp           │
                              │  (this repo)            │
                              │   ├ FastMCP @mcp.tool() │
                              │   ├ logic.* funcs (DI)  │
                              │   ├ ConfirmationWaiter  │ ← waits for
                               │   └ TeslaClient         │   tesla_fleet_
                              └─────────┬───────────────┘   command_
                                        │                   confirmed
                                        │ signed cmd via HTTP Proxy
                                        ▼
                              ┌─────────────────────────┐
                              │  Tesla HTTP Proxy v0.4.1│
                              └─────────┬───────────────┘
                                        │ BLE / Fleet API
                                        ▼
                                       Tesla

Screenshots

The PWA rendered in a 1280x900 viewport (captured headlessly against the bundled python -m http.server dev server). The Conversation history dialog is visible in all three because headless Chrome promotes a closed <dialog> styled with display: flex to the top layer; in a real Tesla browser session it stays hidden until you tap the History button in the header.

PWA idle state — header, mode pill, vehicle status cards, empty conversation log, settings prompt

Demo mode at ~4 s — the scripted exchange has fired the climate set_temperature turn

Demo mode at ~12 s — climate, vent_windows, and the play-Spotify turn have all played through

Changelog

See CHANGELOG.md. Versions follow Keep a Changelog and the project uses SemVer. For the release runbook, see RELEASE.md.

License

MIT. See LICENSE.

CI

GitHub Actions runs two parallel jobs on every push and PR:

  • python — ruff + pytest on Python 3.12 and 3.13 for the MCP server.
  • pwa — node --check + the Node 24 built-in test runner against pwa/tests/**/*.test.js.

See .github/workflows/ci.yml.
