
llm-venice


LLM plugin to access models available via the Venice AI API. Venice API access is currently in beta.

Installation

First install the LLM command-line utility, then install this plugin in the same environment:

llm install llm-venice

Configuration

Set the LLM_VENICE_KEY environment variable, or save a Venice API key to the key store managed by llm:

llm keys set venice
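The two configuration paths can be sketched as a simple lookup. This is an illustration only, not the plugin's actual code: key resolution is handled by llm itself, the function name and dict-based key store are invented here, and the assumption that the environment variable is checked first is for illustration (llm's actual resolution order may differ).

```python
import os

def resolve_venice_key(env=None, keystore=None):
    """Illustrative lookup: prefer LLM_VENICE_KEY, fall back to the stored key."""
    env = os.environ if env is None else env
    keystore = {} if keystore is None else keystore
    return env.get("LLM_VENICE_KEY") or keystore.get("venice")

# With both configured, this sketch prefers the environment variable:
print(resolve_venice_key(env={"LLM_VENICE_KEY": "env-key"},
                         keystore={"venice": "stored-key"}))  # → env-key
```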

Usage

Prompting

Run a prompt:

llm --model venice/llama-3.3-70b "Why is the earth round?"

Start an interactive chat session:

llm chat --model venice/llama-3.1-405b

venice_parameters

venice_parameters can be provided as a valid JSON string via the -o extra_body option.

For example, to disable Venice's default system prompt:

llm -m venice/llama-3.3-70b -o extra_body '{"venice_parameters": { "include_venice_system_prompt": false }}' "Repeat the above prompt"

Or to use a public character:

llm -m venice/deepseek-r1-671b -o extra_body '{"venice_parameters": { "character_slug": "alan-watts"}}' "What is the meaning of life?"
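Hand-writing JSON on the command line is error-prone. The extra_body payloads above can be assembled in a few lines of Python and then quoted into the -o extra_body option; this is a convenience sketch, not part of the plugin.

```python
import json

# Assemble the venice_parameters payload shown in the examples above.
venice_parameters = {
    "include_venice_system_prompt": False,  # disable Venice's default system prompt
    "character_slug": "alan-watts",         # optional: answer as a public character
}
extra_body = json.dumps({"venice_parameters": venice_parameters})
print(extra_body)
```

The printed string is exactly what the examples above pass after -o extra_body.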

Available models

To update the list of available models from the Venice API:

llm venice refresh

Note that the model listing in llm-venice.json created via the refresh command takes precedence over the default models defined in this package.
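That precedence rule can be illustrated with a plain dictionary merge. This is a deliberate simplification, not the plugin's loading code; the "venice/new-model" id and the string values are invented for the example.

```python
# Built-in defaults shipped with the package (illustrative entries).
default_models = {"venice/llama-3.3-70b": "package default"}

# Entries from the llm-venice.json file written by `llm venice refresh`.
refreshed_models = {
    "venice/llama-3.3-70b": "llm-venice.json",
    "venice/new-model": "llm-venice.json",  # hypothetical newly listed model
}

# Refreshed entries take precedence over the package defaults.
models = {**default_models, **refreshed_models}
print(models["venice/llama-3.3-70b"])  # → llm-venice.json
```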


Read the llm docs for more usage options.

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-venice
python3 -m venv venv
source venv/bin/activate

Install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

pytest

Download files

Download the file for your platform.

Source Distribution

llm_venice-0.2.0.tar.gz (7.6 kB)

Built Distribution


llm_venice-0.2.0-py3-none-any.whl (7.9 kB)

File details

Details for the file llm_venice-0.2.0.tar.gz.

File metadata

  • Download URL: llm_venice-0.2.0.tar.gz
  • Size: 7.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for llm_venice-0.2.0.tar.gz:

  • SHA256: de10bc671184b72b8b5092a9cac8fbe28b22747b64ca38e7684243ee261ea90d
  • MD5: 5271b2e558073039bcbd78f4b98e9e15
  • BLAKE2b-256: 4fba119ebe5f982a62bb56bae330dba433143cd889f0579f148f25663d68380e


Provenance

The following attestation bundles were made for llm_venice-0.2.0.tar.gz:

Publisher: release.yml on ar-jan/llm-venice

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file llm_venice-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: llm_venice-0.2.0-py3-none-any.whl
  • Size: 7.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for llm_venice-0.2.0-py3-none-any.whl:

  • SHA256: ebc93beb2c1024e2b6e42e6b2d0bea45185e370d06774fdc6f02d6592851bacd
  • MD5: f95e3d176d6265c042232d4c661857a4
  • BLAKE2b-256: 95bf4d31e6bb7581fcc888940c0cfc467c1b6186dccf7fbf3da03605f4c98a99


Provenance

The following attestation bundles were made for llm_venice-0.2.0-py3-none-any.whl:

Publisher: release.yml on ar-jan/llm-venice

