
Keep the bloat out! - A lightweight LLM interaction library


🤺 Fence

Fence is a simple, lightweight library for LLM communication. A lot of the functionality was inspired by or derived from LangChain (the OG LLM package) basics, since that's how the package was born - as a stripped-down version of LangChain functionality, with cooler names.

🤔 Raison d'être

The simple answer: by accident. The slightly longer answer: LangChain used to be (is?) a pretty big package with a ton of dependencies. The upside is that it's powerful for PoC purposes, because it has it all.

The downsides:

  • It's big. It takes up a lot of space (which can be an issue in some environments/runtimes), often for functionality that isn't needed.
  • It's fairly complex. It's a big package with a lot of functionality, which can be overwhelming for new users.
  • It hasn't always been dependable in an industrial setting: version jumps were common, and the package often broke after a new release.

As a result, many developers (particularly those working in large production environments) have advocated for more lightweight, custom functionality that favors stability and robustness.

Circling back: why Fence?

Since our work was in a production environment, mostly dealing with Bedrock, we just started building some basic components from scratch. We needed a way to communicate with our models, which turned into the Link class (wink wink). Then some other things were added left and right, and this eventually grew into a miniature package. In no small part because it was fun to go down this road, but mostly because it strikes the right balance between convenience and flexibility.

Naturally, it's nowhere near as powerful as, for instance, LangChain. If you want to build a quick PoC with relatively complex logic, maybe go for the OG instead. If you want to be set on your way with a simple, lightweight package that's easy to understand and extend, Fence might be the way to go.

🛠️ How do I use it?

Fence just has a few basic components. See the notebooks for examples of how to use them. Documentation is coming soon, but for now, you can check out the source code for more details.

📦 Installation

You can install Fence from PyPI:

pip install fence-llm

👋 Look ma, no dependencies (kinda)!

Here's a hello world example:

from fence import Link
from fence.templates.string import StringTemplate
from fence.models.openai import GPT4omini

# Create a link
link = Link(
    model=GPT4omini(),
    template=StringTemplate("Write a poem about the value of a {topic}!"),
    name='hello_world_link'
)

# Run the link - run() returns a dict, with the model output stored under the 'state' key
output = link.run(topic='fence')['state']
print(output)

This will output something like:

[2024-10-04 17:45:15] [ℹ️ INFO] [links.run:203]              Executing <hello_world_link> Link
Sturdy wood and nails,
Boundaries draw peace and calm,
Guarding hearts within.

Much wow, very next level. There's a lot more to cover in the notebook section!

💪 Features

What can I do with Fence?

  • Uniform interface for LLMs. Since our main use case was Bedrock, we built Fence to work with Bedrock models. However, it also has OpenAI support, and it's easy to extend to other models (contributors welcome!)
  • Links and Chains help you build complex pipelines with multiple models. This is a feature that's been around since LangChain, and it's still here. You can parametrize templates and pass the output of one model to another (see the sketch below this list).
  • Template classes that handle the basics and work across models (e.g., a MessageTemplate can be sent to a Bedrock Claude3 model or to an OpenAI model - system/user/assistant formatting is handled under the hood).
  • Agents to move on to the sweet, sweet next level of LLM orchestration. Built using the ReAct pattern.
  • Basic utils on board for typical tasks like retries, parallelization, logging, output parsers, etc.
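
To illustrate the Links part, here's a minimal sketch that passes the output of one Link into another by hand, reusing only the calls from the hello world example above (the prompts and the theme/topic variable names are made up for illustration, and we assume run() takes the template variables as keyword arguments, as shown there). The Chain classes automate this kind of handoff - check the notebooks and source for their actual API.

from fence import Link
from fence.templates.string import StringTemplate
from fence.models.openai import GPT4omini

# Reuse one model instance across links
model = GPT4omini()

# First link: come up with a topic (prompt made up for illustration)
topic_link = Link(
    model=model,
    template=StringTemplate("Suggest a single-word topic related to {theme}."),
    name='topic_link'
)

# Second link: write a haiku about that topic
haiku_link = Link(
    model=model,
    template=StringTemplate("Write a haiku about {topic}."),
    name='haiku_link'
)

# Feed the first link's output into the second one by hand;
# a Chain wires up this handoff for you
topic = topic_link.run(theme='gardens')['state']
haiku = haiku_link.run(topic=topic)['state']
print(haiku)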

What can't I do with Fence?

It's obviously not as powerful as some of the other packages out there, which offer many more features. We're also not trying to fall into the trap of building 'yet another framework' (insert XKCD here), so we're trying to guard our scope. If you need a lot of bells and whistles, you might want to look at one of these:

  • LangChain - the OG, no explanation needed.
  • A more recent package with a lot of cool features! Great for building PoCs, too. Built by ex-AWS folks, and it promises to be a lot more industry-oriented.

🗺️ Roadmap

  • Add more models (e.g., native Anthropic models)
  • Add more tests 😬
  • Add more notebook tutorials to showcase features

🤝 Contributing

We welcome contributions! Check out the CONTRIBUTING.md for more details.

