
A ZMQ client interface for llama server

Project description

Llama Server Client is a Python package that provides a ZMQ client for interacting with a TCP Llama server that mimics OpenAI's chat completion API.
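The package's own API is not documented on this page, so as a hedged illustration, here is what an OpenAI-style chat completion exchange over a ZMQ REQ socket might look like. The JSON-over-REQ/REP framing, the socket address, and the helper names are assumptions rather than the library's documented interface; only the request and response shapes follow OpenAI's chat completion schema.

```python
import json


def build_chat_request(messages, model="llama", temperature=0.8):
    """Build an OpenAI-style chat completion request body.

    The exact fields the server accepts are an assumption; these mirror
    OpenAI's /v1/chat/completions schema.
    """
    return json.dumps({
        "model": model,
        "messages": messages,
        "temperature": temperature,
    }).encode("utf-8")


def parse_chat_response(raw):
    """Extract the assistant's reply from an OpenAI-style response body."""
    body = json.loads(raw)
    return body["choices"][0]["message"]["content"]


# Sending the request over ZMQ would look roughly like this (requires
# pyzmq; the address and framing are assumptions, not the package's API):
#
#   import zmq
#   ctx = zmq.Context()
#   sock = ctx.socket(zmq.REQ)
#   sock.connect("tcp://localhost:5555")
#   sock.send(build_chat_request([{"role": "user", "content": "Hello"}]))
#   reply = parse_chat_response(sock.recv())
```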

Download files

Download the file for your platform.

Source Distribution

llama_server_client-1.2.2.tar.gz (14.9 kB)

Uploaded Source

Built Distribution


llama_server_client-1.2.2-py3-none-any.whl (25.8 kB)

Uploaded Python 3

File details

Details for the file llama_server_client-1.2.2.tar.gz.

File metadata

  • Download URL: llama_server_client-1.2.2.tar.gz
  • Upload date:
  • Size: 14.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.11.11

File hashes

Hashes for llama_server_client-1.2.2.tar.gz:

  • SHA256: 6f2932577631b829511a846e1000a93fd88e0ad3fe9bf59063e6a8c4a5481f51
  • MD5: 9bb38ed356289a60bda55fa0b8795fa4
  • BLAKE2b-256: eeea5b7b48838111d9a363176aee3374dbceb80914c0dc3b745cda3fc13fc039
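The published hashes can be checked locally with Python's standard-library `hashlib`. A minimal sketch, assuming the sdist has been downloaded to the current directory under the file name listed above:

```python
import hashlib


def file_sha256(path, chunk_size=65536):
    """Compute the SHA-256 hex digest of a file, reading it in chunks
    so large downloads do not need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Verifying the sdist against the SHA256 value published on this page:
#
#   assert file_sha256("llama_server_client-1.2.2.tar.gz") == (
#       "6f2932577631b829511a846e1000a93fd88e0ad3fe9bf59063e6a8c4a5481f51")
```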


File details

Details for the file llama_server_client-1.2.2-py3-none-any.whl.

File metadata

File hashes

Hashes for llama_server_client-1.2.2-py3-none-any.whl:

  • SHA256: b4bdee6468a6ddb52f7adf9a98f13019d839e4f3c12850fc5c57bae7df6480b9
  • MD5: 7cf99d4e027a326950f6bc349c20e459
  • BLAKE2b-256: 1b24823d18bfe6b41300196079df79c511531380a969a65570ca137f04e6cfda

