An LLM inference solution for quickly deploying a production-ready LLM service
Project description
The author of this package has not provided a project description
Download files
Download the file for your platform.
Source Distribution

llm-serve-0.0.1.tar.gz (95.0 kB)

Built Distribution

llm_serve-0.0.1-py3-none-any.whl (114.9 kB)
Hashes for llm_serve-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | b2176a89759929ba4af2c0a602ac6cff2e4c211fc42a0d9c188a9f22f6a8d226
MD5 | 04819ca2c42cb33d74fc19a47505cbc0
BLAKE2b-256 | 00844b356d3b866cfea971717fc70a4354cab57797d2161849d72887194b460a
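After downloading, the digests above can be checked locally. A minimal sketch using Python's standard-library `hashlib` (the wheel filename is assumed to be in the current directory):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Return the hex SHA256 digest of a file, read in chunks
    so large files are not loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the SHA256 digest listed above for the wheel:
expected = "b2176a89759929ba4af2c0a602ac6cff2e4c211fc42a0d9c188a9f22f6a8d226"
# assert sha256_of("llm_serve-0.0.1-py3-none-any.whl") == expected
```

A mismatch between the computed and published digest indicates a corrupted or tampered download.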