OpenLLM Core: Core components for OpenLLM.
Reason this release was yanked:
prompt_token_id bug for mutable objects
Project description
📖 Introduction
With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, and build powerful AI applications.
To learn more about OpenLLM, please visit OpenLLM's README.md
This package holds the core components of OpenLLM and is considered internal.
Components include:
- Configuration generation.
- Utilities for interacting with OpenLLM server.
- Schema and generation utilities for OpenLLM server.
📔 Citation
If you use OpenLLM in your research, please use the following citation:
@software{Pham_OpenLLM_Operating_LLMs_2023,
  author  = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month   = jun,
  title   = {{OpenLLM: Operating LLMs in production}},
  url     = {https://github.com/bentoml/OpenLLM},
  year    = {2023}
}
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
openllm_core-0.5.0.tar.gz (53.1 kB)
Built Distribution
openllm_core-0.5.0-py3-none-any.whl
Hashes for openllm_core-0.5.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 709d6e3931feb354b3bf20f06f6a71e3a45746342b5f2dc4f79bf9d08a3a5a72
MD5 | 1f391d18d4e856d02ec887a70ce98df6
BLAKE2b-256 | 490b2468b884102eafead12a75639f6db0283e7090c21dc6a5fc229e1407b54a
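To check that a downloaded distribution matches the published digests above, you can hash the file locally. A minimal sketch, assuming the file has been downloaded to the current directory (the path and expected digest below are illustrative placeholders):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in fixed-size chunks so large wheels don't need to fit in memory.
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare against the SHA256 value from the table above, e.g.:
# sha256_of("openllm_core-0.5.0-py3-none-any.whl")
```

For automated installs, `pip install --require-hashes` with a pinned requirements file performs the same verification.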