
Project description

Banner for OpenLLM

👾 OpenLLM Client


OpenLLM Client: Interacting with OpenLLM HTTP/gRPC server, or any BentoML server.

📖 Introduction

With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, and build powerful AI applications.

To learn more about OpenLLM, please visit OpenLLM's README.md.

This package holds the underlying client implementation for OpenLLM. If you are coming from OpenLLM, the client can be accessed via openllm.client.

It provides an API similar to bentoml.Client (via openllm_client.min) for interacting with an OpenLLM server, and it can also be extended to work with a general BentoML server.

[!NOTE] Interop with a generic BentoML server is considered EXPERIMENTAL and will be refactored into a new client implementation soon. If you are only using this package to interact with an OpenLLM server, the API is the same as the openllm.client namespace.

import openllm

# Create a client for a running OpenLLM server (defaults to the local server address)
client = openllm.client.HTTPClient()

client.query('Explain to me the difference between "further" and "farther"')
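
The client can also point at a remote server and be used asynchronously. A minimal sketch, assuming a server listening at http://localhost:3000 and that this release exposes HTTPClient and AsyncHTTPClient with a generate method (exact names may vary between versions):

import asyncio

from openllm_client import HTTPClient, AsyncHTTPClient

# Synchronous client bound to an explicit server address (address is an assumption)
client = HTTPClient('http://localhost:3000')
print(client.generate('What is the capital of France?'))

# Asynchronous client for use inside an event loop
async def main():
    async_client = AsyncHTTPClient('http://localhost:3000')
    response = await async_client.generate('What is the capital of France?')
    print(response)

asyncio.run(main())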

Gif showing OpenLLM Intro

Gif showing Agent integration

📔 Citation

If you use OpenLLM in your research, please use the following citation:

@software{Pham_OpenLLM_Operating_LLMs_2023,
  author  = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month   = jun,
  title   = {{OpenLLM: Operating LLMs in production}},
  url     = {https://github.com/bentoml/OpenLLM},
  year    = {2023}
}



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
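
For most users, installing this release straight from PyPI with pip is simpler than downloading files by hand; the version pin below matches the files listed on this page:

python -m pip install "openllm-client==0.4.33"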

Source Distribution

openllm_client-0.4.33.tar.gz (21.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

openllm_client-0.4.33-py3-none-any.whl (19.3 kB)

Uploaded Python 3

File details

Details for the file openllm_client-0.4.33.tar.gz.

File metadata

  • Download URL: openllm_client-0.4.33.tar.gz
  • Upload date:
  • Size: 21.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for openllm_client-0.4.33.tar.gz
Algorithm Hash digest
SHA256 d4e83aa36318b99b95a5c9b2a3fbed2b64f328f9d834ed3514c507f9cddf240c
MD5 cf34d9bc98909f4abc78d05bf8a4a16e
BLAKE2b-256 8d52ceee1b2f4570a068328713be8ca90bcdc82cb1c5d2973d1e35856b2d762a

See more details on using hashes here.
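
As an illustration, the SHA256 digest above can be re-checked locally with Python's standard hashlib module after downloading the sdist (the file path below assumes the archive sits in the current directory):

import hashlib

# Expected SHA256 for openllm_client-0.4.33.tar.gz, copied from the table above
EXPECTED = 'd4e83aa36318b99b95a5c9b2a3fbed2b64f328f9d834ed3514c507f9cddf240c'

with open('openllm_client-0.4.33.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, f'hash mismatch: {digest}'
print('sha256 verified')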

File details

Details for the file openllm_client-0.4.33-py3-none-any.whl.

File metadata

  • Download URL: openllm_client-0.4.33-py3-none-any.whl
  • Size: 19.3 kB
  • Tags: Python 3

File hashes

Hashes for openllm_client-0.4.33-py3-none-any.whl
Algorithm Hash digest
SHA256 9a2d23ed37d6fc04211cb35b07477891fd6a48e116dc1ffe0d0ad71e78362a83
MD5 2e95e913fb48917db589624f622ca63e
BLAKE2b-256 d98ba5667b0dfdb78753e1e3a18341307e43911b752d592cb054e4e9739b67a7

See more details on using hashes here.
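
These digests can also be enforced at install time: pip's --require-hashes mode refuses to install anything whose hash does not match a pin in the requirements file (note that in this mode every requirement, including dependencies, needs a hash). A sketch of a requirements.txt entry using the wheel digest above:

# requirements.txt
openllm-client==0.4.33 \
    --hash=sha256:9a2d23ed37d6fc04211cb35b07477891fd6a48e116dc1ffe0d0ad71e78362a83

# install with: python -m pip install --require-hashes -r requirements.txt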
