
LLM2Vec-Gen: Generative Embeddings from Large Language Models

Project description


LLM2Vec-Gen is a recipe to train interpretable, generative embeddings that encode the potential answer of an LLM to a query rather than the query itself.

Installation

Install from PyPI, or clone the repository and install it in editable mode:

pip install llm2vec-gen  # or pip install -e .

Usage

Load a pretrained model:

import torch
from llm2vec_gen import LLM2VecGenModel

model = LLM2VecGenModel.from_pretrained("McGill-NLP/LLM2Vec-Gen-Qwen3-8B")

For example, you can use the model for retrieval with the following code snippet:

q_instruction = "Generate a passage that best answers this question: "
d_instruction = "Summarize the following passage: "

queries = [
  "where do polar bears live and what's their habitat",
  "what does disk cleanup mean on a computer"
]
q_reps = model.encode([q_instruction + q for q in queries])

documents = [
  "Polar bears live throughout the circumpolar North in the Arctic, spanning across Canada, Alaska (USA), Russia, Greenland, and Norway. Their primary habitat is sea ice over the continental shelf, which they use for hunting, mating, and traveling. They are marine mammals that rely on this environment to hunt seals.",
  "Disk Cleanup is a built-in Windows tool that frees up hard drive space by scanning for and deleting unnecessary files like temporary files, cached data, Windows updates, and items in the Recycle Bin. It improves computer performance by removing \"junk\" files, which can prevent the system from running slowly due to low storage.",
]
d_reps = model.encode([d_instruction + d for d in documents])

# Compute cosine similarity
q_reps_norm = torch.nn.functional.normalize(q_reps, p=2, dim=1)
d_reps_norm = torch.nn.functional.normalize(d_reps, p=2, dim=1)
cos_sim = torch.mm(q_reps_norm, d_reps_norm.transpose(0, 1))

print(cos_sim)
"""
tensor([[0.8750, 0.1182],
        [0.0811, 0.9336]])
"""

Note that in all examples, the instruction should be phrased as if you were generating the answer to the input.
Further examples of using LLM2Vec-Gen for other tasks (e.g., classification and clustering) are available in the paper's GitHub repository.


LLM2Vec-Gen provides interpretable embeddings. You can use the following code to decode the content embedded in the embeddings:

_, recon_hidden_states = model.encode("what does disk cleanup mean on a computer", get_recon_hidden_states=True)
# recon_hidden_states: torch.Tensor with shape (1, compression token size, hidden_dim)

answer = model.generate(recon_hidden_states=recon_hidden_states, max_new_tokens=55)

print(answer)
"""
* **\n\n**Disk Cleanup** is a built-in utility in Windows that helps you **free up disk space** by **removing unnecessary files** from your computer. It is designed to clean up temporary files, system cache, and other files that are no longer needed.\n\n
"""

This snippet returns the answer that the LLM2Vec-Gen model generates from the generative embeddings of the input (recon_hidden_states).

Citation

If you use this code, models, or data, please cite the LLM2Vec-Gen paper.

@article{behnamghader2026llm2vecgen,
  title={LLM2Vec-Gen: Generative Embeddings from Large Language Models},
  author={BehnamGhader, Parishad and Adlakha, Vaibhav and Schmidt, Fabian David and Chapados, Nicolas and Mosbach, Marius and Reddy, Siva},
  journal={arXiv preprint arXiv:2603.10913},
  year={2026},
  url={https://arxiv.org/abs/2603.10913}
}

Project details


Download files

Download the file for your platform.

Source Distribution

llm2vec_gen-0.1.3.tar.gz (4.6 MB)

Uploaded Source

Built Distribution


llm2vec_gen-0.1.3-py3-none-any.whl (23.9 kB)

Uploaded Python 3

File details

Details for the file llm2vec_gen-0.1.3.tar.gz.

File metadata

  • Download URL: llm2vec_gen-0.1.3.tar.gz
  • Size: 4.6 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.20

File hashes

Hashes for llm2vec_gen-0.1.3.tar.gz

  • SHA256: 77d4afb421e66aff8648822776f6061469d170073cb83b38ce839c2cffa8861e
  • MD5: 71dc636f0d497286a037792d0be08586
  • BLAKE2b-256: 23f3840fa2652eefa93aece1b82e95f77b427fd0448c2fbe39fd2f3845a77ef5

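The published digests can be checked locally with Python's standard-library hashlib. A minimal sketch that streams a stand-in file through SHA256 (substitute the path of the downloaded llm2vec_gen-0.1.3.tar.gz and compare against the SHA256 value listed above):

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=8192):
    """Stream a file through SHA256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in file; replace with the downloaded sdist or wheel.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"example payload")

digest = sha256_of(path)
print(digest)
# Verification: compare digest against the expected SHA256 from the table above.
os.remove(path)
```

Reading in chunks keeps memory use constant, which matters for large distributions; pip performs an equivalent check automatically when hashes are pinned in a requirements file.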

File details

Details for the file llm2vec_gen-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: llm2vec_gen-0.1.3-py3-none-any.whl
  • Size: 23.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.20

File hashes

Hashes for llm2vec_gen-0.1.3-py3-none-any.whl

  • SHA256: 582f4e05a9cf6d19ccba17ff73e02d4d4b6d365faaa3d090d12337eae5078366
  • MD5: 58aa04e23759cc01024fa88fab0ffbed
  • BLAKE2b-256: 7e7647dc65ead0791ed0fa49f92149be26c9bbb9a1093ce999f0ee240637b492

