
OpenLLM Client: Interacting with an OpenLLM HTTP/gRPC server, or any BentoML server.

Project description


👾 OpenLLM Client


OpenLLM Client: Interacting with an OpenLLM HTTP/gRPC server, or any BentoML server.

📖 Introduction

With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, and build powerful AI apps.

To learn more about OpenLLM, please visit OpenLLM's README.md.

This package holds the underlying client implementation for OpenLLM. If you are coming from OpenLLM, the client can be accessed via openllm.client.

It provides an API similar to bentoml.Client (via openllm_client.min) for interacting with an OpenLLM server, and it can be extended to work with a general BentoML server as well.

[!NOTE] Interoperability with generic BentoML servers is considered EXPERIMENTAL and will be refactored into a new client implementation soon. If you are only using this package to interact with an OpenLLM server, the API is the same as the openllm.client namespace.

import openllm

client = openllm.client.HTTPClient()

client.query('Explain to me the difference between "further" and "farther"')
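
You can also pass the server address explicitly when the server is not running on the default local address. A minimal sketch, assuming an OpenLLM server is reachable at http://localhost:3000 (adjust the address to your deployment):

import openllm

# Connect to an OpenLLM server at an explicit address (assumed here to be http://localhost:3000)
client = openllm.client.HTTPClient('http://localhost:3000')

client.query('Explain to me the difference between "further" and "farther"')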

[Demo GIFs: OpenLLM intro and agent integration]

📔 Citation

If you use OpenLLM in your research, please use the following citation:

@software{Pham_OpenLLM_Operating_LLMs_2023,
  author = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month = jun,
  title = {{OpenLLM: Operating LLMs in production}},
  url = {https://github.com/bentoml/OpenLLM},
  year = {2023}
}



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
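
In most cases you do not need to download these files manually: the package can be installed directly from PyPI (for example, pip install openllm-client). Downloading the files below is mainly useful for offline installation or for verifying hashes.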

Source Distribution

openllm_client-0.4.6.tar.gz (21.1 kB, Source)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

openllm_client-0.4.6-py3-none-any.whl (19.2 kB, Python 3)

File details

Details for the file openllm_client-0.4.6.tar.gz.

File metadata

  • Download URL: openllm_client-0.4.6.tar.gz
  • Upload date:
  • Size: 21.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for openllm_client-0.4.6.tar.gz
  • SHA256: 79813161259ab02813f656c9ca75c806358910b9d9ea6c32aacbfbb717568bb2
  • MD5: 050aeff5c03b90bc5ccf41d4bbbb85e6
  • BLAKE2b-256: 72f6a7edea154f89188dfdfb1e6d41c29430f4b20f729e2d3e47c50fa354fa37

See more details on using hashes here.
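
To verify a download locally, you can recompute the SHA256 digest and compare it with the value above. A minimal sketch in Python, assuming the sdist has been saved to the current working directory:

import hashlib

# Path to the downloaded file (assumed to be in the current directory)
path = 'openllm_client-0.4.6.tar.gz'

# Published SHA256 digest from the table above
expected = '79813161259ab02813f656c9ca75c806358910b9d9ea6c32aacbfbb717568bb2'

# Hash the file in chunks so large downloads are not read into memory at once
digest = hashlib.sha256()
with open(path, 'rb') as f:
    for chunk in iter(lambda: f.read(8192), b''):
        digest.update(chunk)

print('OK' if digest.hexdigest() == expected else 'MISMATCH')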

File details

Details for the file openllm_client-0.4.6-py3-none-any.whl.

File metadata

  • Download URL: openllm_client-0.4.6-py3-none-any.whl
  • Upload date:
  • Size: 19.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for openllm_client-0.4.6-py3-none-any.whl
  • SHA256: 19b73b7dd522b41918ae194842cdb89a392b980b3fcaeeb546365cffec781100
  • MD5: 63265f3297f23edd3313eaa82fca9875
  • BLAKE2b-256: f77c39dcc3af8392d1355a3ada6f9bbcc213259fc293ee5d269fb4c17fe5761b

See more details on using hashes here.
