
Project description


👾 OpenLLM Client


OpenLLM Client: a client for interacting with an OpenLLM HTTP/gRPC server, or any BentoML server.

📖 Introduction

With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, and build powerful AI applications.

To learn more about OpenLLM, please visit OpenLLM's README.md.

This package holds the underlying client implementation for OpenLLM. If you are coming from OpenLLM, the client can be accessed via openllm.client.

It provides an API similar to bentoml.Client (via openllm_client.benmin) for interacting with an OpenLLM server, and it can be extended to work with a general BentoML server as well.

[!NOTE] Interoperability with a generic BentoML server is considered experimental and may be merged back into BentoML. If you are only using this package to interact with an OpenLLM server, nothing changes in the openllm.client namespace.
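The note above aside, here is a loudly hypothetical sketch of what pointing the benmin-based client at a generic BentoML server might look like; the Client class and from_url constructor are assumptions modeled on bentoml.Client, not a confirmed API surface:

# Hypothetical sketch: assumes openllm_client.benmin mirrors
# bentoml.Client's from_url constructor. Check the package source
# for the actual API before relying on this.
from openllm_client import benmin

client = benmin.Client.from_url('http://localhost:3000')

For the common case of talking to an OpenLLM server, use openllm.client directly: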

import openllm

# Connect to a running OpenLLM server; pass the address explicitly for a
# remote server, e.g. openllm.client.HTTPClient('http://localhost:3000').
client = openllm.client.HTTPClient()

client.query('Explain to me the difference between "further" and "farther"')
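An asynchronous client is also available; a minimal sketch, assuming openllm.client.AsyncHTTPClient mirrors the synchronous interface (verify against your installed version):

import asyncio

import openllm

async def main():
    # Assumption: AsyncHTTPClient exposes the same query() as HTTPClient,
    # returning an awaitable result.
    client = openllm.client.AsyncHTTPClient('http://localhost:3000')
    print(await client.query('Explain to me the difference between "further" and "farther"'))

asyncio.run(main())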


📔 Citation

If you use OpenLLM in your research, please use the following citation:

@software{Pham_OpenLLM_Operating_LLMs_2023,
  author = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month = jun,
  title = {{OpenLLM: Operating LLMs in production}},
  url = {https://github.com/bentoml/OpenLLM},
  year = {2023}
}

See the project repository for the full changelog.

Release history

This version: 0.3.1

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

openllm_client-0.3.1.tar.gz (17.8 kB, Source)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

openllm_client-0.3.1-py3-none-any.whl (18.1 kB, Python 3)

File details

Details for the file openllm_client-0.3.1.tar.gz.

File metadata

  • Download URL: openllm_client-0.3.1.tar.gz
  • Upload date:
  • Size: 17.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for openllm_client-0.3.1.tar.gz

  • SHA256: 33ec46e88ecaee5938fdc4f1fb2ea70a537215cfdc2622acd0a2503bdc9a231c
  • MD5: 3887994bea474e68f136e642ae1b2187
  • BLAKE2b-256: 249d2928ff68acd8f13b30ef63db0762e281426f4f44ff702d4f0c0ad4e146ec

See more details on using hashes here.
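As an illustration, a downloaded archive can be checked against the SHA256 digest above with Python's standard hashlib; a minimal sketch, assuming the sdist was saved to the current directory:

import hashlib

# Expected SHA256 for openllm_client-0.3.1.tar.gz, copied from the table above
EXPECTED = '33ec46e88ecaee5938fdc4f1fb2ea70a537215cfdc2622acd0a2503bdc9a231c'

with open('openllm_client-0.3.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, f'hash mismatch: {digest}'
print('openllm_client-0.3.1.tar.gz verified')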

File details

Details for the file openllm_client-0.3.1-py3-none-any.whl.

File metadata

  • Download URL: openllm_client-0.3.1-py3-none-any.whl
  • Upload date:
  • Size: 18.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for openllm_client-0.3.1-py3-none-any.whl

  • SHA256: 9a053a70e75d04eb8e180bf078102e4615351268aaf75d95356d10f61ae2f1fb
  • MD5: 831541d4ad8eb795d481b6de1489af70
  • BLAKE2b-256: 8efff998c5dd5efc07ad99c8dbcf0015939ac2301f6f64dd595041a1cb901837

See more details on using hashes here.
