
MultiChat

Python CLI tool that sends the same message to multiple LLMs and returns their responses.

Warning: mostly vibe-coded using Codex with GPT-5.

Usage

Basic

Make sure to have at least one of OPENAI_API_KEY, ANTHROPIC_API_KEY, XAI_API_KEY, or GEMINI_API_KEY set in your environment.

For example, you can stick them in your ~/.zshrc or ~/.bashrc like so:

export ANTHROPIC_API_KEY=
export GEMINI_API_KEY=
export OPENAI_API_KEY=
export XAI_API_KEY=
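To verify which keys your shell actually exports, a quick check like this works (a minimal sketch, assuming a POSIX-compatible shell):

```shell
# Report which provider API keys are set in the current environment.
for var in OPENAI_API_KEY ANTHROPIC_API_KEY XAI_API_KEY GEMINI_API_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    echo "$var is set"
  else
    echo "$var is NOT set"
  fi
done
```

Providers whose key is missing are simply skipped, so at least one "is set" line is enough to get started.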

Then, simply run uvx multichat <message>!

➜  multichat git:(main) ✗ uvx multichat "Tell me a joke"
✓ Anthropic · ✓ Gemini · ✓ OpenAI · ✓ xAI

[claude-opus-4-1 · 2.70s]
Why don't scientists trust atoms?

Because they make up everything! 😄

[gpt-5 · 3.96s]
I started a band called 999 Megabytes—still waiting for a gig.

[grok-4 · 4.71s]
Sure, here's a classic one:

Why don't skeletons fight each other?
They don't have the guts!

Haha, got any favorites of your own? 😊

[gemini-2.5-pro · 9.37s]
Why don't scientists trust atoms?

Because they make up everything
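Conceptually, the fan-out can be sketched with asyncio. This is a hypothetical illustration, not multichat's actual implementation; the provider calls are stubbed with sleeps:

```python
# Query every provider concurrently and report each model's latency.
import asyncio
import time

async def ask_provider(model: str, message: str, delay: float) -> tuple[str, str, float]:
    start = time.perf_counter()
    await asyncio.sleep(delay)          # stand-in for a real API call
    reply = f"{model} reply to {message!r}"
    return model, reply, time.perf_counter() - start

async def fan_out(message: str) -> list[tuple[str, str, float]]:
    tasks = [
        ask_provider("claude-opus-4-1", message, 0.01),
        ask_provider("gpt-5", message, 0.02),
    ]
    # gather() runs all requests concurrently, so total wall time is roughly
    # the slowest provider's latency, not the sum of all of them.
    return await asyncio.gather(*tasks)
```

This is why the timings in the transcript above overlap: the slowest model (here gemini-2.5-pro at 9.37s) bounds the total wait.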

Continuing conversation

You can optionally continue the conversation with the --continue flag; each LLM receives the entire preceding exchange (including the other models' replies) as context.

➜  multichat git:(main) ✗ uvx multichat --continue "Which one is funniest?"
✓ Anthropic · ✓ Gemini · ✓ OpenAI · ✓ xAI

[claude-opus-4-1 · 5.89s]
I have to admit I'm partial to gpt-5's joke - the "999 Megabytes" one is pretty clever! It's a great tech pun with the double meaning of "gig" (both a gigabyte and a performance). Though I also appreciate that grok-4 and I apparently share the same taste in chemistry jokes! 😄

What did you think? Did any of them get a laugh from you?

[grok-4 · 7.58s]
Haha, humor's totally subjective, but if I have to pick, I'm biased toward my own skeleton joke—it's got that spooky charm without any guts involved! What do you think? 😄

[gpt-5 · 9.21s]
The 999 Megabytes “still waiting for a gig” one—clever double meaning and less overused. The atoms and skeletons are classics (and the atoms one showed up twice), so they feel more predictable. Which landed best for you?

[gemini-2.5-pro · 16.18s]
Humor is definitely subjective, but here's my breakdown:

*   **GPT-5's** joke is the most original and clever of the bunch. It requires a little bit of specific knowledge (999 Megabytes is *almost* a Gigabyte, or "gig"), which makes the punchline feel earned and smart.
*   **Claude's and my** atom joke is a certified classic pun. It's a top-tier "dad joke" that's clever because of the wordplay on "make up." The fact that two of us told it shows how popular it is.
*   **Grok-4's** skeleton joke is a timeless, simple classic. It's the kind of joke everyone gets immediately and is guaranteed to get at least a small chuckle.

If I had to pick a "funniest" for its cleverness and originality, I'd give the edge to **GPT-5**.

Which one made you laugh the most?
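One way --continue could work is to persist the transcript and replay it on the next turn so every model sees the full conversation. The sketch below is hypothetical; multichat's actual storage format and location are not shown here:

```python
# Persist conversation turns as a JSON list of role/content messages.
import json
from pathlib import Path

def load_history(path: Path) -> list[dict]:
    # An absent file means a fresh conversation.
    if path.exists():
        return json.loads(path.read_text())
    return []

def build_messages(history: list[dict], new_message: str) -> list[dict]:
    # Each provider gets the prior turns plus the new user message.
    return history + [{"role": "user", "content": new_message}]

def save_turn(path: Path, messages: list[dict], reply: str) -> None:
    # Append the assistant reply and write the updated transcript back.
    path.write_text(json.dumps(messages + [{"role": "assistant", "content": reply}]))
```

Because the full history is replayed, a model can comment on another model's earlier answer, as gpt-5 does above when it notes the atoms joke "showed up twice".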

Piping input

You can also pipe content from stdin instead of passing a positional message:

echo "hello" | uvx multichat

Optionally, you can add a message to the end of the piped input:

cat README.md | uvx multichat "How do I run this?"
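Combining piped stdin with an optional trailing message might look like this (illustrative only; not multichat's actual argument handling):

```python
# Build the prompt from piped stdin plus an optional positional message.
import sys

def read_prompt(argv: list[str]) -> str:
    parts = []
    if not sys.stdin.isatty():            # stdin is a pipe, not a terminal
        piped = sys.stdin.read().strip()
        if piped:
            parts.append(piped)
    if argv:                              # optional trailing message
        parts.append(" ".join(argv))
    return "\n\n".join(parts)
```

Under this scheme, `cat README.md | multichat "How do I run this?"` would send the README contents followed by the question as one prompt.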

Development

When working in this repo, run uv run multichat/main.py instead of uvx multichat, so you exercise your local changes rather than the published package.

Run formatting with uvx ruff format.

Run tests with uv run pytest.

Releasing

To release a new version:

  1. Update version in pyproject.toml
  2. Commit and push to main

The CI workflow automatically:

  • Runs tests
  • Creates a GitHub Release with the new version tag
  • Publishes to PyPI via trusted publishing
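The workflow above could be sketched roughly as follows. This is a hypothetical illustration using the standard trusted-publishing action; the repo's actual pypi.yml may differ:

```yaml
# Hypothetical sketch of a release workflow; not the project's actual pypi.yml.
name: release
on:
  push:
    branches: [main]

jobs:
  release:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required for PyPI trusted publishing (OIDC)
      contents: write   # required to create the GitHub Release
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5
      - run: uv run pytest              # run tests
      - run: uv build                   # build sdist + wheel into dist/
      - uses: pypa/gh-action-pypi-publish@release/v1   # publish via OIDC
```

Trusted publishing means no long-lived PyPI token is stored in the repo; PyPI verifies the workflow's OIDC identity instead.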

Download files

Source Distribution

  multichat-0.3.8.tar.gz (10.6 kB)

Built Distribution

  multichat-0.3.8-py3-none-any.whl (11.6 kB)

File details

Details for the file multichat-0.3.8.tar.gz.

File metadata

  • Size: 10.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

  Algorithm    Hash digest
  SHA256       98d7064222373520ee33fee6db5bdf311b086d437c9ace573668b654e840b675
  MD5          4d25549f8c225a76b0aeff1ebf20abad
  BLAKE2b-256  23c9aeef8050be6f45bb8ac83ea065883c81daa32cc59af5ad611ef6cc81b336

Provenance

The following attestation bundles were made for multichat-0.3.8.tar.gz:

Publisher: pypi.yml on sergeyk/multichat


File details

Details for the file multichat-0.3.8-py3-none-any.whl.

File metadata

  • Size: 11.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

  Algorithm    Hash digest
  SHA256       e38ecd4633945eff459dec0682a85a161f43402ddc3dabb4004598bdc14e342e
  MD5          b6ccb04468415c141af0a6605afbcd58
  BLAKE2b-256  8c4e8fbc180ae41ec3ec69e891e3d3d6b013d3d69c073415f570f8f671d7b700

Provenance

The following attestation bundles were made for multichat-0.3.8-py3-none-any.whl:

Publisher: pypi.yml on sergeyk/multichat

