Keep the bloat out! - A lightweight LLM interaction library
🤺 Fence
Fence is a simple, lightweight library for LLM communication. A lot of the functionality was inspired by/derived from LangChain (the OG LLM package) basics, since that's how the package was born: as a stripped-down version of LangChain functionality, with cooler names.
🤔 Raison d'être
The simple answer: by accident. The slightly longer answer: LangChain used to be (is?) a pretty big package with a ton of dependencies. The upside is that it's powerful for PoC purposes, because it has it all.
The downsides:
- It's big. It takes up a lot of space (which can be an issue in some environments/runtimes), often for functionality that isn't needed.
- It's fairly complex. It's a big package with a lot of functionality, which can be overwhelming for new users.
- It hasn't always been dependable in an industrial setting. Version jumps were frequent, and new releases would often break existing code.
As a result, many developers (particularly those working in large production environments) have advocated for more lightweight, custom functionality that favors stability and robustness.
Circling back: why Fence?
Since our work was in a production environment, mostly dealing with Bedrock, we just started building some basic components from scratch. We needed a way to communicate with our models, which turned out to be the Link class (wink wink).
Then, some other things were added left and right, and this eventually turned into a miniature package. In no small part because it was fun to go down this road, but mostly because it strikes the right balance between convenience and flexibility.
Naturally, it's nowhere near as powerful as, for instance, LangChain. If you want to build a quick PoC with relatively complex logic, maybe go for the OG instead. If you want to be set on your way with a simple, lightweight package that's easy to understand and extend, Fence might be the way to go.
🛠️ How do I use it?
Fence just has a few basic components. See the notebooks for examples on how to use them. Documentation is coming soon, but for now, you can check out the source code for more details.
📦 Installation
You can install Fence from PyPI:
pip install fence-llm
👋 Look ma, no dependencies (kinda)!
Here's a hello world example:
from fence import Link
from fence.templates.string import StringTemplate
from fence.models.openai import GPT4omini
# Create a link
link = Link(
    model=GPT4omini(),
    template=StringTemplate("Write a poem about the value of a {topic}!"),
    name='hello_world_link'
)
# Run the link
output = link.run(topic='fence')['state']
print(output)
This will output something like:
[2024-10-04 17:45:15] [ℹ️ INFO] [links.run:203] Executing <hello_world_link> Link
Sturdy wood and nails,
Boundaries draw peace and calm,
Guarding hearts within.
Much wow, very next level. There's a lot more to cover in the notebook section!
💪 Features
What can I do with Fence?
- Uniform interface for LLMs. Since our main use case was Bedrock, we built Fence to work with Bedrock models. However, it also has OpenAI support, and it's easy to extend to other models (contributors welcome!)
- Links and Chains help you build complex pipelines with multiple models. This is a feature that's been around since LangChain, and it's still here. You can parametrize templates and pass the output of one model to another (see the sketch after this list).
- Template classes that handle the basics, and that work across models (e.g., a MessageTemplate can be sent to a Bedrock Claude 3 model or to an OpenAI model - system/user/assistant formatting is handled under the hood).
- Agents to move on to the sweet, sweet next level of LLM orchestration. Built using the ReAct pattern.
- Basic utils on board for typical tasks like retries, parallelization, logging, output parsers, etc.
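To make the Links idea concrete, here's a minimal sketch that wires two links together by hand, reusing only the API shown in the hello world example above (Link, StringTemplate, GPT4omini, and run() returning a dict with a 'state' key). The prompts and link names are made up for illustration, and the manual hand-off is just to show the pattern - the Chain class (see the notebooks) is meant to handle this kind of wiring for you.
from fence import Link
from fence.templates.string import StringTemplate
from fence.models.openai import GPT4omini

# One model instance, shared by both links
model = GPT4omini()

# First link: draft a slogan (prompt and name are made up for this sketch)
draft_link = Link(
    model=model,
    template=StringTemplate("Write a one-line slogan for a {product}!"),
    name='draft_link'
)

# Second link: refine whatever the first link produced
polish_link = Link(
    model=model,
    template=StringTemplate("Make this slogan snappier: {slogan}"),
    name='polish_link'
)

# Pass the output of the first link into the second by hand
slogan = draft_link.run(product='garden fence')['state']
print(polish_link.run(slogan=slogan)['state'])
The same pattern scales to longer pipelines: the only contract between links is a parametrized template on one side and the 'state' output on the other.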
What can't I do with Fence?
It's obviously not as powerful as some of the other packages out there, which pack a ton more features. We're also not trying to fall into the trap of building 'yet another framework' (insert XKCD here), so we're trying to guard our scope. If you need a lot of bells and whistles, you might want to look at one of these:
- The OG, no explanation needed.
- A more recent package, with a lot of cool features! Great for building PoCs, too. Built by ex-AWS folks, and promises to be a lot more industry-oriented.
🗺️ Roadmap
- Add more models (e.g., native Anthropic models)
- Add more tests 😬
- Add more notebook tutorials to showcase features
🤝 Contributing
We welcome contributions! Check out the CONTRIBUTING.md for more details.
File details
Details for the file fence_llm-0.0.20.tar.gz.
File metadata
- Download URL: fence_llm-0.0.20.tar.gz
- Upload date:
- Size: 41.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 3007c0fde3a8864a2a07da9b0353c063d4ee661d13344ededbfc31a74ed831ad
MD5 | 3b3dd2139104912b70e36b541d640371
BLAKE2b-256 | ae503e48c54dbae17af70f8d66af2b29da661f2259048520999ef457ee52010c
File details
Details for the file fence_llm-0.0.20-py3-none-any.whl.
File metadata
- Download URL: fence_llm-0.0.20-py3-none-any.whl
- Upload date:
- Size: 60.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/5.1.1 CPython/3.12.7
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5d66e0c3c32d65680b7b851e935bc05783a526a1519391c0e12cfe95abbf40cb
MD5 | 2b0a7ad1f98608af55cc6448b9772f92
BLAKE2b-256 | e7e0e3a804730dcc1c8f531f147021f5652e066a6e17ac2c6fd75d921abac897