Neurocache
A library for augmenting language models with external caching mechanisms
Requirements
- Python 3.6+
- PyTorch 1.13.0+
- Transformers 4.25.0+
Installation
pip install neurocache
Getting started
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

from neurocache import (
    NeurocacheModelForCausalLM,
    OnDeviceCacheConfig,
)

model_name = "facebook/opt-350m"

# Load the base model and tokenizer as usual.
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Cache hidden states from a layer near the top of the stack.
cache_layer_idx = model.config.num_hidden_layers - 5

config = OnDeviceCacheConfig(
    # Layers whose hidden states are written to the external cache.
    cache_layers=[cache_layer_idx, cache_layer_idx + 3],
    # Layers that attend over the retrieved cache entries.
    attention_layers=list(range(cache_layer_idx, model.config.num_hidden_layers)),
    # Compress cached states by a factor of 8 to save memory.
    compression_factor=8,
    # Number of cache entries retrieved per query token.
    topk=8,
)

# Wrap the base model with the external caching mechanism.
model = NeurocacheModelForCausalLM(model, config)

input_text = ["Hello, my dog is cute", "Hello, my cat is cute"]
tokenized_input = tokenizer(input_text, return_tensors="pt")

# Mark which sequences begin a new document so their cache starts fresh.
tokenized_input["start_of_sequence"] = torch.tensor([0, 1]).bool()

outputs = model(**tokenized_input)
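Conceptually, a cache-augmented model like this retrieves, for each query token, the `topk` cached hidden states most similar to it. The following is an illustrative sketch of such a top-k lookup, not Neurocache's actual internals; the function name, shapes, and similarity measure are assumptions:

```python
import torch


def retrieve_topk(cache_keys, cache_values, queries, topk):
    """Return the topk cached values whose keys best match each query.

    cache_keys:   (cache_size, dim)  keys for the cached hidden states
    cache_values: (cache_size, dim)  the cached hidden states themselves
    queries:      (num_queries, dim) current hidden states
    """
    # Similarity between each query and every cached key.
    scores = queries @ cache_keys.T               # (num_queries, cache_size)
    _, top_idx = scores.topk(topk, dim=-1)        # best-matching cache slots
    # Gather the cached values for those slots.
    retrieved = cache_values[top_idx]             # (num_queries, topk, dim)
    return retrieved, top_idx
```

In the configuration above, the layers listed in `attention_layers` would then attend over the retrieved states.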
Supported model types
from neurocache.utils import NEUROCACHE_SUPPORTED_MODELS
print(NEUROCACHE_SUPPORTED_MODELS)
[
"opt",
"llama",
"mistral",
"gptj",
]
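Since wrapping an unsupported architecture will not work, it can help to check a model's `config.model_type` against this list before wrapping. A minimal sketch; the `is_supported` helper is hypothetical and not part of the library, and the set mirrors the list printed above:

```python
# Mirrors NEUROCACHE_SUPPORTED_MODELS from neurocache.utils.
SUPPORTED_MODEL_TYPES = {"opt", "llama", "mistral", "gptj"}


def is_supported(model_type: str) -> bool:
    """Return True if a Hugging Face config.model_type is in the supported list."""
    return model_type.lower() in SUPPORTED_MODEL_TYPES
```

For example, `is_supported(model.config.model_type)` returns True for `facebook/opt-350m`, whose `model_type` is `"opt"`.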
Project details
Download files
Source Distribution
neurocache-0.0.1.tar.gz (26.2 kB)
Built Distribution
neurocache-0.0.1-py3-none-any.whl (29.5 kB)
Hashes for neurocache-0.0.1-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 30eaed454d061804ef63f005fe99d0427c407475eac502b03adf8ecc75e94b40
MD5 | 0e3f4c837d302cea2cdf22eaa77462e0
BLAKE2b-256 | af8c685e65a2f7afca86992b738b0065077aa892a616ce71a749d8be8a8cc9c3