The web framework for building LLM microservices
⚠️ Disclaimer: This project is now in maintenance mode. I won't be adding new features or actively maintaining the project as I have moved on to other projects and priorities. While I will address critical bugs and security issues as needed, active development has ceased from my end. I do encourage the community to continue to contribute to the project if they find it useful. Thank you for using lanarky!
Lanarky is a Python (3.9+) web framework for developers who want to build microservices using LLMs. Here are some of its key features:
- LLM-first: Unlike other web frameworks, Lanarky is built specifically for LLM developers. It is unopinionated about how you build your microservices and guarantees zero vendor lock-in with any LLM tooling framework or cloud provider
- Fast & Modern: Built on top of FastAPI, lanarky offers all the FastAPI features you know and love. If you are new to FastAPI, visit fastapi.tiangolo.com to learn more
- Streaming: Streaming is essential for many real-time LLM applications, like chatbots. Lanarky has got you covered with built-in streaming support over HTTP and WebSockets.
- Open-source: Lanarky is open-source and free to use. Forever.
To learn more about lanarky and get started, you can find the full documentation on lanarky.ajndkr.com
Installation
The library is available on PyPI and can be installed via pip:
pip install lanarky
Getting Started
Lanarky provides a powerful abstraction layer to allow developers to build simple LLM microservices in just a few lines of code.
Here's an example of a simple microservice that uses OpenAI's ChatCompletion service:
from lanarky import Lanarky
from lanarky.adapters.openai.resources import ChatCompletionResource
from lanarky.adapters.openai.routing import OpenAIAPIRouter

app = Lanarky()
router = OpenAIAPIRouter()


@router.post("/chat")
def chat(stream: bool = True) -> ChatCompletionResource:
    system = "You are a sassy assistant"
    return ChatCompletionResource(stream=stream, system=system)


app.include_router(router)
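Once the service is running (e.g. `uvicorn main:app`, assuming the example above is saved as `main.py`), any HTTP client that can read server-sent events can consume the streamed response. Below is a minimal sketch using only the standard library; the `/chat` path comes from the example above, but the OpenAI-style `messages` payload shape and the `data: ` SSE framing are assumptions to adjust against the Lanarky docs:

```python
import http.client
import json


def extract_sse_data(lines: list[str]) -> list[str]:
    """Pull payloads out of raw server-sent-event lines ("data: ..." entries)."""
    prefix = "data: "
    return [line[len(prefix):] for line in lines if line.startswith(prefix)]


def collect_stream(host: str = "localhost", port: int = 8000) -> str:
    """POST to the /chat endpoint above and join the streamed chunks.

    The OpenAI-style payload shape is an assumption; adjust it to match
    the deployed service.
    """
    payload = {"messages": [{"role": "user", "content": "Hello!"}]}
    conn = http.client.HTTPConnection(host, port)
    conn.request(
        "POST",
        "/chat",
        body=json.dumps(payload),
        headers={"Content-Type": "application/json"},
    )
    response = conn.getresponse()
    # HTTPResponse is file-like, so iterating yields raw byte lines.
    lines = [raw.decode("utf-8").rstrip("\r\n") for raw in response]
    conn.close()
    return "".join(extract_sse_data(lines))
```

In practice you would print each chunk as it arrives rather than joining at the end; the point here is only the shape of the request and the SSE line framing.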
Visit Getting Started for the full tutorial on building and testing your first LLM microservice with Lanarky.
Contributing
Contributions are more than welcome! If you have an idea for a new feature or want to help improve lanarky, please create an issue or submit a pull request on GitHub.
See CONTRIBUTING.md for more information.
License
The library is released under the MIT License.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file lanarky-0.8.8.tar.gz.
File metadata
- Download URL: lanarky-0.8.8.tar.gz
- Upload date:
- Size: 16.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.9.19 Linux/6.5.0-1022-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | d4bee3c80608f2752fe111aea57299bac7ce893351741b36af05d2de5be0118f
MD5 | 4a6dd5dfb40a7761cf694be3470e4c96
BLAKE2b-256 | 2a7eedf33bdb2970c23c22bec7cd8f9c7fe8ec14c2030023927ab78950ee9442
File details
Details for the file lanarky-0.8.8-py3-none-any.whl.
File metadata
- Download URL: lanarky-0.8.8-py3-none-any.whl
- Upload date:
- Size: 23.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.8.3 CPython/3.9.19 Linux/6.5.0-1022-azure
File hashes
Algorithm | Hash digest
---|---
SHA256 | b61781e7b3e89b228ef10b71994e5be5a7179936c898dc122ddcdc2e17a45bec
MD5 | 65f4e6bf7338630eebd7f03ffcf0d786
BLAKE2b-256 | a600772c93fe1bc21a413f2f59fffb430d50f5ce2f74da148da12b238c1be243