LLM inference proxy server
Project description
LM-Proxy
LM-Proxy is an OpenAI-compatible HTTP proxy server for running inference against various LLM backends, including the Google, Anthropic, and OpenAI APIs, as well as local PyTorch models.
Development Status: still in early development; bookmark it and come back later.
✨ Features
- OpenAI-compatible HTTP API, so existing OpenAI clients work unchanged
- Proxies inference to multiple backends: Google, Anthropic, and OpenAI APIs
- Local PyTorch inference support
🚀 Quickstart
# Install LM-Proxy via pip
pip install llm-proxy-server
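Once the proxy is running, any OpenAI-compatible client can talk to it. A minimal sketch using the official `openai` Python client; the base URL, API key, and model name here are illustrative assumptions, not documented defaults:

```python
from openai import OpenAI

# Point the standard OpenAI client at the local proxy instead of api.openai.com.
# Host, port, key, and model below are placeholder assumptions.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="unused-local-key",
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # the proxy routes this to whichever backend is configured
    messages=[{"role": "user", "content": "Hello from LM-Proxy!"}],
)
print(response.choices[0].message.content)
```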
🤝 Contributing
We ❤️ contributions! See CONTRIBUTING.md.
📝 License
Licensed under the MIT License.
© 2022–2025 Vitalii Stepanenko
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file llm_proxy_server-0.0.2.tar.gz.
File metadata
- Download URL: llm_proxy_server-0.0.2.tar.gz
- Upload date:
- Size: 2.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 74adf6c0297f750a0abfc8821ba6fdc917d9d08f3d3d0fb0f5c50409cf675b0e |
| MD5 | aede9ef32635a0bf9dd2627a83f9becf |
| BLAKE2b-256 | 91b273d9c1ed4cea5c88dd1315f164b3fa02ae39d1100528a6ce187c970087f3 |
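To verify a downloaded archive against the published digest, you can hash it locally. A minimal sketch using Python's standard `hashlib`, assuming the sdist has been downloaded to the current directory:

```python
import hashlib
from pathlib import Path

# Published SHA256 digest for llm_proxy_server-0.0.2.tar.gz (from the table above).
EXPECTED_SHA256 = "74adf6c0297f750a0abfc8821ba6fdc917d9d08f3d3d0fb0f5c50409cf675b0e"

# Hash the downloaded file and compare with the expected value.
data = Path("llm_proxy_server-0.0.2.tar.gz").read_bytes()
actual = hashlib.sha256(data).hexdigest()
print("OK" if actual == EXPECTED_SHA256 else f"MISMATCH: {actual}")
```

The same check applies to the wheel below, substituting its file name and digest.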
File details
Details for the file llm_proxy_server-0.0.2-py3-none-any.whl.
File metadata
- Download URL: llm_proxy_server-0.0.2-py3-none-any.whl
- Upload date:
- Size: 3.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.1
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 7429560c3cea8f6bcf0795e4cfa35543f2da921a56ec1d90b007e400f49baa8e |
| MD5 | 0c35377ba77f2d0dbf3e3576e50c8266 |
| BLAKE2b-256 | e0db85b4711dd9b9aa3ea629360706b5b532278115a81c802550277f7db485d7 |