
zopyx.llm-moonshot

LLM plugin for Moonshot AI's models


Project Status

This package is maintained as the zopyx.llm-moonshot fork of the original llm-moonshot project.

The fork keeps the plugin usable in current environments and aligns packaging, CI/CD, and release workflow with the zopyx distribution. The post-fork changes and the reasons for them are documented in CHANGELOG.md.

Installation

First, install the LLM command-line utility.

Now install this plugin in the same environment as LLM:

llm install zopyx.llm-moonshot

Configuration

You'll need an API key from Moonshot. Grab one at platform.moonshot.cn.

Set the API key:

llm keys set moonshot
Enter key: <paste key here>

Usage

List what's on the menu:

llm models list

You'll see something like:

[[[cog
import cog
from llm_moonshot._models import DEFAULT_MOONSHOT_MODEL_IDS
for model_id in DEFAULT_MOONSHOT_MODEL_IDS:
    cog.outl(f"Moonshot: moonshot/{model_id}")
]]]
Moonshot: moonshot/kimi-latest
Moonshot: moonshot/moonshot-v1-auto
Moonshot: moonshot/moonshot-v1-128k-vision-preview
Moonshot: moonshot/kimi-k2-0711-preview
Moonshot: moonshot/moonshot-v1-128k
Moonshot: moonshot/moonshot-v1-32k-vision-preview
Moonshot: moonshot/moonshot-v1-8k-vision-preview
Moonshot: moonshot/moonshot-v1-8k
Moonshot: moonshot/kimi-thinking-preview
Moonshot: moonshot/moonshot-v1-32k
Moonshot: moonshot/kimi-k2-thinking
[[[end]]]

Fire up a chat:

llm chat -m moonshot/kimi-k2-0711-preview
Chatting with moonshot/kimi-k2-0711-preview
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> yo moonie
yo! what's up, moonie?
>

Need raw completion?

llm -m moonshot/moonshot-v1-8k "Finish this haiku: Neon city rain"
Neon city rain,
Glistening streets, a symphony,
Echoes of the night.
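
The same completion can be requested programmatically. Moonshot's API is OpenAI-compatible, so a request can be built with nothing but the standard library. The sketch below is an illustration, not code from this plugin; the endpoint URL and payload shape are assumed to follow the OpenAI chat-completions convention.

```python
import json
import urllib.request

# Assumed OpenAI-compatible endpoint for Moonshot's hosted models.
MOONSHOT_API = "https://api.moonshot.cn/v1/chat/completions"


def build_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completions payload for a single prompt."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def complete(api_key: str, model: str, prompt: str) -> str:
    """POST the payload and return the first choice's text."""
    req = urllib.request.Request(
        MOONSHOT_API,
        data=json.dumps(build_payload(model, prompt)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

In practice you would read the key from `llm keys get moonshot` or an environment variable rather than hard-coding it.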

Reasoning Content Support

This plugin supports reasoning content for Moonshot's thinking models (the models with "thinking" in their names). When you use one of these models, the model's reasoning process is displayed in real time before the final response:

llm chat -m moonshot/kimi-k2-thinking
[Reasoning] (shown in dim cyan)

The user is asking me to solve a complex problem. Let me think through this step by step...
First, I need to understand the core requirements...
Then I'll analyze the available options...

[Response] (shown in bold green)

Here's my well-reasoned answer to your question...

Available Thinking Models

  • moonshot/kimi-k2-thinking - Latest reasoning model
  • moonshot/kimi-thinking-preview - Preview reasoning model

The reasoning content helps you understand:

  • Decision-making process - See how the model analyzes problems
  • Multi-step reasoning - Follow complex thought chains
  • Error detection - Catch logical gaps or misunderstandings early
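
Under the hood, thinking models stream two kinds of delta: a reasoning field and the usual answer field. The sketch below illustrates the splitting pattern only; the plugin's actual handling lives in its source, and the chunk shape here is assumed from the OpenAI streaming convention (a `reasoning_content` key alongside the standard `content` key).

```python
def split_stream(chunks):
    """Separate reasoning deltas from answer deltas in a streamed response.

    Each chunk is assumed to be a dict shaped like an OpenAI streaming
    delta, carrying either a "reasoning_content" or a "content" piece.
    """
    reasoning, answer = [], []
    for delta in chunks:
        if delta.get("reasoning_content"):
            # Reasoning pieces arrive first and are rendered dim.
            reasoning.append(delta["reasoning_content"])
        if delta.get("content"):
            # Answer pieces follow and form the final response.
            answer.append(delta["content"])
    return "".join(reasoning), "".join(answer)
```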

Aliases

Save your wrists:

llm aliases set kimi moonshot/kimi-latest

Now:

llm -m kimi "write a haiku about the AI chatbot Sidney is misbehaving"

Troubleshooting

Models don't appear in llm models list

  • Make sure you have set a Moonshot API key with llm keys set moonshot.
  • The plugin caches the model catalog for one hour. If the API is unreachable, it falls back to a built-in catalog.
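
The cache-then-fallback behaviour described above follows a common pattern. A minimal sketch (not the plugin's actual code; names and the exception type are illustrative):

```python
import time

CACHE_TTL = 3600  # one hour, as described above
FALLBACK_MODELS = ["kimi-latest", "moonshot-v1-8k"]  # stand-in for the built-in catalog

_cache = {"models": None, "fetched_at": 0.0}


def get_models(fetch, now=time.time):
    """Return the model catalog, refreshing at most once per hour.

    `fetch` is a callable that queries the Moonshot API; if it fails,
    the last cached catalog (or the built-in fallback) is returned.
    """
    if _cache["models"] is not None and now() - _cache["fetched_at"] < CACHE_TTL:
        return _cache["models"]
    try:
        models = fetch()
    except OSError:
        return _cache["models"] or FALLBACK_MODELS
    _cache["models"] = models
    _cache["fetched_at"] = now()
    return models
```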

Streaming connection dropped

  • The plugin automatically retries without streaming when Moonshot closes the connection mid-stream. This is a known upstream behavior and is handled transparently.
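
The retry logic described above amounts to a simple wrapper: try the streaming call first, and fall back to one non-streaming request if the connection drops mid-stream. This sketch illustrates the pattern only; the call signature and exception type are assumptions, not the plugin's actual code.

```python
def prompt_with_retry(call, prompt):
    """Try streaming first; on a dropped connection, retry without streaming.

    `call(prompt, stream=...)` is assumed to return an iterator of text
    pieces when stream=True and the full text when stream=False.
    """
    try:
        return "".join(call(prompt, stream=True))
    except ConnectionError:
        # Moonshot occasionally closes the connection mid-stream;
        # a plain non-streaming request usually succeeds.
        return call(prompt, stream=False)
```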

401 Unauthorized

  • The API key is missing, invalid, or expired. Set it again with llm keys set moonshot and confirm the key is still active on platform.moonshot.cn.

Development

Clone, sync, build:

git clone https://github.com/zopyx/llm-moonshot.git
cd llm-moonshot
uv sync --extra dev
make check
make dist

To publish using .pypirc with twine:

make upload

See CONTRIBUTING.md for more details.

Project details

Download files

Source distribution: zopyx_llm_moonshot-0.3.7.tar.gz (14.9 kB)

Built distribution: zopyx_llm_moonshot-0.3.7-py3-none-any.whl (11.7 kB)

File details

Details for the file zopyx_llm_moonshot-0.3.7.tar.gz.

File metadata

  • Download URL: zopyx_llm_moonshot-0.3.7.tar.gz
  • Upload date:
  • Size: 14.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for zopyx_llm_moonshot-0.3.7.tar.gz

  • SHA256: cca8ee142c97d7a8fa82b6736aa78a3cfd57600e46fb6124adf3ed1b3b3b108a
  • MD5: 3e9e8fdbfca3a498fa4f66f221d286ec
  • BLAKE2b-256: 0188219c0f7e122a208ae0b26d16caa02d2b480d60d162dce31f5a06df6a8aa2

Provenance

The following attestation bundles were made for zopyx_llm_moonshot-0.3.7.tar.gz:

Publisher: publish.yml on zopyx/llm-moonshot

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file zopyx_llm_moonshot-0.3.7-py3-none-any.whl.

File hashes

Hashes for zopyx_llm_moonshot-0.3.7-py3-none-any.whl

  • SHA256: 0538c4b1b1d93b08b9743cf112f2bb2f1df4337336eeb9a62d9853070789b71c
  • MD5: dff898143cb3c3549dade82c230969ce
  • BLAKE2b-256: 81e3e246ad5ba2f8e3f6fe1c8813b2aa8ce4d391ff390a7b477d119683933eb0

Provenance

The following attestation bundles were made for zopyx_llm_moonshot-0.3.7-py3-none-any.whl:

Publisher: publish.yml on zopyx/llm-moonshot

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
