
llm-nosrun


LLM plugin for models hosted by NOS Cloud

Installation

First, install the LLM command-line utility.

Now install this plugin in the same environment as LLM.

llm install llm-nosrun

Configuration

You will need an API key from NOS Cloud.

You can set that as an environment variable called LLM_NOSRUN_KEY, or save it to llm's set of stored keys using:

llm keys set nosrun
Enter key: <paste key here>
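Either way of providing the key works; a minimal sketch of one plausible resolution order (environment variable first, then the stored key) is shown below. The helper name and the dict standing in for llm's saved-keys store are hypothetical, not taken from the plugin's source:

```python
import os

def resolve_nosrun_key(stored_keys):
    """Return the NOS Cloud API key, preferring the environment variable.

    `stored_keys` is a stand-in for the contents of llm's saved-keys
    store (written by `llm keys set nosrun`). Hypothetical helper.
    """
    return os.environ.get("LLM_NOSRUN_KEY") or stored_keys.get("nosrun")

# The environment variable wins when both are set.
os.environ["LLM_NOSRUN_KEY"] = "env-key"
print(resolve_nosrun_key({"nosrun": "stored-key"}))  # env-key
```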

Usage

To list available models, run:

llm models list

You should see a list that looks something like this:

NOSRun: TinyLlama/TinyLlama-1.1B-Chat-v1.0
NOSRun: meta-llama/Llama-2-7b-chat-hf
NOSRun: meta-llama/Llama-2-13b-chat-hf
NOSRun: meta-llama/Llama-2-70b-chat-hf
NOSRun: HuggingFaceH4/zephyr-7b-beta
NOSRun: HuggingFaceH4/tiny-random-LlamaForCausalLM
NOSRun: NousResearch/Yarn-Mistral-7b-128k
NOSRun: mistralai/Mistral-7B-Instruct-v0.2
NOSRun: mistralai/Mixtral-8x7B-Instruct-v0.1
NOSRun: TheBloke/TinyLlama-1.1B-Chat-v1.0-AWQ
NOSRun: TheBloke/Mixtral-8x7B-Instruct-v0.1-AWQ
NOSRun: mlabonne/phixtral-2x2_8
NOSRun: mlabonne/phixtral-4x2_8

To run a prompt against a model, pass its full model ID to the -m option, like this:

llm -m TinyLlama/TinyLlama-1.1B-Chat-v1.0 \
  'Five strident names for a pet walrus' \
  --system 'You love coming up with creative names for pets'
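Hosted-model providers like this commonly accept OpenAI-style chat payloads; the sketch below shows what the request body for the command above might look like. The payload shape is an assumption for illustration, not taken from the plugin's source:

```python
import json

# Hypothetical OpenAI-style chat payload equivalent to the CLI example
# above; the real plugin builds its request internally.
payload = {
    "model": "TinyLlama/TinyLlama-1.1B-Chat-v1.0",
    "messages": [
        {"role": "system",
         "content": "You love coming up with creative names for pets"},
        {"role": "user",
         "content": "Five strident names for a pet walrus"},
    ],
}

print(json.dumps(payload, indent=2))
```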

You can set a shorter alias for a model using the llm aliases command like so:

llm aliases set tinyllama TinyLlama/TinyLlama-1.1B-Chat-v1.0

Now you can prompt TinyLlama using:

cat llm_nosrun.py | \
  llm -m tinyllama -s 'explain this code'

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-nosrun
python3 -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

pip install -e '.[test]'

To run the tests:

pytest
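Plugin tests typically stub out the network and assert on how prompts are assembled. The sketch below is a generic pytest-style example of that pattern; the function and test names are hypothetical, not taken from this repo's test suite:

```python
# test_example.py -- hypothetical pytest-style tests; run with `pytest`.

def build_messages(prompt, system=None):
    """Toy stand-in for a plugin's message-building logic."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": prompt})
    return messages

def test_system_prompt_comes_first():
    messages = build_messages("hi", system="be brief")
    assert [m["role"] for m in messages] == ["system", "user"]

def test_no_system_prompt():
    assert build_messages("hi") == [{"role": "user", "content": "hi"}]
```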
