A langchain, transformers, and attention_sinks wrapper for longform response generation.
Project description
Langanisa
Wrapper for LangChain + Transformers + attention_sinks
This is currently a pre-release; use at your own risk.
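Since the package is published on PyPI (as the distribution files below show), it can be installed with pip in the usual way:

```shell
pip install langanisa
```

As this is a 0.0.1 pre-release, pinning the exact version (`pip install langanisa==0.0.1`) is advisable so future breaking releases are not pulled in silently.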
Download files
Source Distribution
langanisa-0.0.1.tar.gz (6.3 kB)
Built Distribution
langanisa-0.0.1-py3-none-any.whl
Hashes for langanisa-0.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 6b2f4d1ce122e24ae1686666eb4e820e862ccdac0fcc39075aaa45891e91db69
MD5 | e7d1a16c0ff19e65be5b5b9f9f2829c1
BLAKE2b-256 | e29a3269858bbdc033dca8281e67cbb73294d8ab66e9159118a93dff294c9aa6