zopyx.llm-moonshot

LLM plugin for models hosted by Moonshot AI.

Project Status

This package is maintained as the zopyx.llm-moonshot fork of the original llm-moonshot project.

The fork keeps the plugin usable in current environments and aligns packaging, CI/CD, and release workflow with the zopyx distribution. The post-fork changes and the reasons for them are documented in CHANGELOG.md.

Installation

First, install the LLM command-line utility, for example with pip install llm or brew install llm.

Now install this plugin in the same environment as LLM:

llm install zopyx.llm-moonshot

Configuration

You'll need an API key from Moonshot. Grab one at platform.moonshot.cn.

Set the API key:

llm keys set moonshot
Enter key: <paste key here>
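Under the hood, llm resolves a key from its key store first and can fall back to the environment. A minimal sketch of that lookup order (the MOONSHOT_API_KEY variable name is an illustrative assumption, not necessarily what this plugin reads):

```python
import os

def resolve_moonshot_key(stored_keys, env_var="MOONSHOT_API_KEY"):
    """Return the API key: the llm key store wins, then the environment.

    `stored_keys` stands in for the contents of llm's keys.json; the
    environment variable name is a hypothetical example.
    """
    return stored_keys.get("moonshot") or os.environ.get(env_var)
```
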

Usage

List what's on the menu:

llm models list

You'll see something like:

[[[cog
import cog
from llm_moonshot._models import DEFAULT_MOONSHOT_MODEL_IDS
for model_id in DEFAULT_MOONSHOT_MODEL_IDS:
    cog.outl(f"Moonshot: moonshot/{model_id}")
]]]
Moonshot: moonshot/kimi-latest
Moonshot: moonshot/moonshot-v1-auto
Moonshot: moonshot/moonshot-v1-128k-vision-preview
Moonshot: moonshot/kimi-k2-0711-preview
Moonshot: moonshot/moonshot-v1-128k
Moonshot: moonshot/moonshot-v1-32k-vision-preview
Moonshot: moonshot/moonshot-v1-8k-vision-preview
Moonshot: moonshot/moonshot-v1-8k
Moonshot: moonshot/kimi-thinking-preview
Moonshot: moonshot/moonshot-v1-32k
Moonshot: moonshot/kimi-k2-thinking
[[[end]]]

Fire up a chat:

llm chat -m moonshot/kimi-k2-0711-preview
Chatting with moonshot/kimi-k2-0711-preview
Type 'exit' or 'quit' to exit
Type '!multi' to enter multiple lines, then '!end' to finish
> yo moonie
yo! what's up, moonie?
>

Need raw completion?

llm -m moonshot/moonshot-v1-8k "Finish this haiku: Neon city rain"
Neon city rain,
Glistening streets, a symphony,
Echoes of the night.
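Moonshot's API is OpenAI-compatible, so a one-shot prompt like the one above reduces to a small JSON request body. A sketch of how such a payload is assembled (field choices here are illustrative, not the plugin's actual defaults):

```python
import json

def build_completion_payload(model, prompt, system=None, stream=True):
    """Assemble an OpenAI-style chat.completions request body."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return {"model": model, "messages": messages, "stream": stream}

payload = build_completion_payload(
    "moonshot-v1-8k", "Finish this haiku: Neon city rain", stream=False)
print(json.dumps(payload, indent=2))
```
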

Reasoning Content Support

This plugin supports reasoning content for Moonshot's thinking models (models with "thinking" in the name). With a thinking model, the model's reasoning process streams in real time before the final response:

llm chat -m moonshot/kimi-k2-thinking
[Reasoning] (shown in dim cyan)

The user is asking me to solve a complex problem. Let me think through this step by step...
First, I need to understand the core requirements...
Then I'll analyze the available options...

[Response] (shown in bold green)

Here's my well-reasoned answer to your question...

Available Thinking Models

  • moonshot/kimi-k2-thinking - Latest reasoning model
  • moonshot/kimi-thinking-preview - Preview reasoning model

The reasoning content helps you understand:

  • Decision-making process - See how the model analyzes problems
  • Multi-step reasoning - Follow complex thought chains
  • Error detection - Catch logical gaps or misunderstandings early
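Mechanically, thinking models interleave a reasoning field with the normal content field in the streamed deltas, and the client splits the two. A stdlib-only sketch of that split (the chunk shape mirrors OpenAI-style streaming deltas with a reasoning_content field; treat the exact field names as an assumption):

```python
def split_reasoning_stream(chunks):
    """Separate reasoning deltas from response deltas.

    Each chunk is a dict shaped like an OpenAI streaming chunk;
    returns (reasoning_text, response_text).
    """
    reasoning, response = [], []
    for chunk in chunks:
        delta = chunk.get("choices", [{}])[0].get("delta", {})
        if delta.get("reasoning_content"):
            reasoning.append(delta["reasoning_content"])
        if delta.get("content"):
            response.append(delta["content"])
    return "".join(reasoning), "".join(response)

# Simulated stream from a thinking model:
chunks = [
    {"choices": [{"delta": {"reasoning_content": "Think step 1. "}}]},
    {"choices": [{"delta": {"reasoning_content": "Think step 2."}}]},
    {"choices": [{"delta": {"content": "Final answer."}}]},
]
```
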

Aliases

Save your wrists:

llm aliases set kimi moonshot/kimi-latest

Now:

llm -m kimi "write a haiku about the AI chatbot Sidney misbehaving"

Troubleshooting

Models don't appear in llm models list

  • Make sure you have set a Moonshot API key with llm keys set moonshot.
  • The plugin caches the model catalog for one hour. If the API was unreachable, it falls back to a built-in catalog.
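The one-hour cache with a built-in fallback can be sketched as follows (a simplified model of the behavior described above, not the plugin's actual code):

```python
import time

FALLBACK_CATALOG = ["kimi-latest", "moonshot-v1-8k"]  # trimmed built-in list
TTL_SECONDS = 3600  # one hour
_cache = {"models": None, "fetched_at": 0.0}

def get_model_ids(fetch, now=time.time):
    """Return model ids, refreshing at most once per TTL_SECONDS.

    `fetch` stands in for a call to Moonshot's model-listing endpoint;
    on failure we keep stale data, or fall back to the built-in list
    if nothing was ever fetched.
    """
    if _cache["models"] is not None and now() - _cache["fetched_at"] < TTL_SECONDS:
        return _cache["models"]
    try:
        _cache["models"] = fetch()
        _cache["fetched_at"] = now()
    except OSError:
        if _cache["models"] is None:
            return FALLBACK_CATALOG
    return _cache["models"]
```
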

Streaming connection dropped

  • The plugin automatically retries without streaming when Moonshot closes the connection mid-stream. This is a known upstream behavior and is handled transparently.
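The retry amounts to catching a mid-stream disconnect and replaying the request without streaming. A minimal sketch (the exception type and call shape are illustrative, not the plugin's actual internals):

```python
def complete_with_fallback(request, stream_ok=True):
    """Try a streaming completion; on a dropped connection, retry non-streaming.

    `request(stream=...)` stands in for the actual HTTP call; a
    ConnectionError models Moonshot closing the stream mid-response.
    """
    if stream_ok:
        try:
            return "".join(request(stream=True))
        except ConnectionError:
            pass  # fall through to a single non-streaming request
    return request(stream=False)
```
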

401 Unauthorized

  • Your API key is missing, invalid, or expired. Re-run llm keys set moonshot with a fresh key from platform.moonshot.cn.

Development

Clone, sync, build:

git clone https://github.com/zopyx/llm-moonshot.git
cd llm-moonshot
uv sync --extra dev
make check
make dist

To publish using .pypirc with twine:

make upload

See CONTRIBUTING.md for more details.

