
Chat with Wikipedia

Project description

RAG-demo

Chat with (a small portion of) Wikipedia

⚠️ RAG functionality is still under development. ⚠️

[app screenshot]

Requirements

  1. The uv Python package manager
    • Installing and updating uv is easy by following the docs.
    • As of 2026-01-25, I'm developing with uv version 0.9.26 and using the new experimental --torch-backend option.
  2. A terminal emulator or web browser

Notes on terminal emulators

Certain terminal emulators will not work with some features of this program. In particular, on macOS consider using iTerm2 instead of the default Terminal.app (explanation). On Linux you might want to try kitty, wezterm, alacritty, or ghostty, instead of the terminal that came with your desktop environment (reason). Windows Terminal should be fine as far as I know.
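As a rough sanity check (my suggestion, not something the app itself requires), you can see whether your emulator advertises 24-bit color, which modern TUI frameworks take advantage of:

```shell
# "truecolor" or "24bit" suggests the emulator supports 24-bit color;
# anything else (or an empty value) may mean degraded rendering.
printf '%s\n' "${COLORTERM:-unset}"
```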

Optional dependencies

  1. Hugging Face login
  2. API key for your favorite LLM provider (support coming soon)
  3. Ollama installed on your system if you have a GPU
  4. A more capable machine (with a bigger GPU) to run RAG-demo on over SSH, if you have access to one. It is a terminal app, after all.
  5. A C compiler if you want to build Llama.cpp from source.
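For the Hugging Face login, one possible approach (assuming the standard huggingface_hub CLI, which this README doesn't name explicitly) is:

```shell
# Log in so that gated or rate-limited models can be downloaded.
# huggingface-cli ships with the huggingface_hub package.
uvx --from=huggingface_hub huggingface-cli login
```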

Run the latest version

Run in a terminal:

uvx --torch-backend=auto --from=jehoctor-rag-demo@latest chat

Or run in a web browser:

uvx --torch-backend=auto --from=jehoctor-rag-demo@latest textual serve chat
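If @latest ever resolves to a broken release, you can pin the version shown on this page instead (0.2.4 at the time of writing):

```shell
uvx --torch-backend=auto --from=jehoctor-rag-demo==0.2.4 chat
```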

CUDA acceleration via Llama.cpp

If you have an NVIDIA GPU with CUDA and build tools installed, you might be able to get CUDA acceleration without installing Ollama.

CMAKE_ARGS="-DGGML_CUDA=on" uv run --extra=llamacpp chat
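Before attempting the source build, it may be worth confirming that the CUDA toolchain is actually visible; these checks are my suggestion, not part of the project:

```shell
nvidia-smi      # driver is loaded and the GPU is visible
nvcc --version  # CUDA compiler needed to build with -DGGML_CUDA=on
cc --version    # host C compiler is present
```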

Metal acceleration via Llama.cpp (on Apple Silicon)

On an Apple Silicon machine, make sure uv uses an ARM interpreter, which should cause it to install Llama.cpp with Metal support. Also enable the llamacpp extra. Try this:

uvx --python-platform=aarch64-apple-darwin --torch-backend=auto --from='jehoctor-rag-demo[llamacpp]@latest' chat
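To confirm the interpreter really is an ARM build (and not x86_64 under Rosetta), a quick check:

```python
import platform

# On Apple Silicon with a native interpreter this prints "arm64";
# "x86_64" means the interpreter is running under Rosetta and the
# Metal build of Llama.cpp will not be selected.
print(platform.machine())
```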

Ollama on Linux

Remember that you have to keep Ollama up-to-date manually on Linux. A recent version of Ollama (v0.11.10 or later) is required to run the embedding model we use. See this FAQ: https://docs.ollama.com/faq#how-can-i-upgrade-ollama.
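Per the Ollama FAQ linked above, the documented upgrade path on Linux is to re-run the install script:

```shell
# Re-running the official install script upgrades Ollama in place.
curl -fsSL https://ollama.com/install.sh | sh
ollama --version  # confirm v0.11.10 or later
```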

Project feature roadmap

  • ❌ RAG functionality
  • ✅ torch inference via the LangChain local Hugging Face inference integration
  • ✅ uv automatic torch backend selection (see the docs)
  • ❌ OpenAI integration
  • ❌ Anthropic integration

Run from the repository

First, clone this repository. Then, run one of the options below.

Run in a terminal:

uv run chat

Or run in a web browser:

uv run textual serve chat



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

jehoctor_rag_demo-0.2.4.tar.gz (20.1 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

jehoctor_rag_demo-0.2.4-py3-none-any.whl (29.3 kB)

File details

Details for the file jehoctor_rag_demo-0.2.4.tar.gz.

File metadata

  • Download URL: jehoctor_rag_demo-0.2.4.tar.gz
  • Size: 20.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.0 on Ubuntu 24.04 (noble)

File hashes

Hashes for jehoctor_rag_demo-0.2.4.tar.gz

  • SHA256: 09544dd41edf1f86846840385951cf84b686af30236aff9f61d746e52a7b34bd
  • MD5: 5375bd82a39773abd11c1d9e9f8a072e
  • BLAKE2b-256: e1365700e693e14487e06b3b495acd6754586938a1bb88eb404927ce469edc52

See more details on using hashes here.

File details

Details for the file jehoctor_rag_demo-0.2.4-py3-none-any.whl.

File metadata

  • Download URL: jehoctor_rag_demo-0.2.4-py3-none-any.whl
  • Size: 29.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.10.0 on Ubuntu 24.04 (noble)

File hashes

Hashes for jehoctor_rag_demo-0.2.4-py3-none-any.whl

  • SHA256: d7863ec9052623e71b55eeaa394ee722722c65c863034a6ba7742f1235ae38c8
  • MD5: f1be4635f9e813ecd34acfdb709be121
  • BLAKE2b-256: 33df8757ee732a10f2d549b2d6faf6240f66327d14677e7848ac759b4f71b716

See more details on using hashes here.
