RankLLM
A package for running prompt decoders like RankVicuna.
We offer a suite of prompt decoders, with a focus on open-source LLMs compatible with FastChat (e.g., Vicuna, Zephyr). Some of the code in this repository is borrowed from RankGPT.
Releases
Current version: 0.12.8
📟 Instructions
Create Conda Environment
conda create -n rankllm python=3.10
conda activate rankllm
Install Pytorch with CUDA
pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
Install Dependencies
pip install -r requirements.txt
Run End-to-End Test
python src/rank_llm/scripts/run_rank_llm.py --model_path=castorini/rank_zephyr_7b_v1_full --top_k_candidates=100 --dataset=dl20 \
--retrieval_method=SPLADE++_EnsembleDistil_ONNX --prompt_mode=rank_GPT --context_size=4096 --variable_passages
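The models above perform listwise reranking in the style of RankGPT: the LLM is shown a numbered list of candidate passages and asked to emit a permutation such as `[2] > [1] > [3]`, which is then parsed back into a reordering. The sketch below illustrates that idea only; the function names, prompt template, and parsing logic are simplified illustrations and are not the actual `rank_llm` API.

```python
# Hypothetical sketch of listwise prompt-decoding (RankGPT/RankVicuna style).
# Not the rank_llm package's real interface; for illustration only.
import re


def build_listwise_prompt(query: str, passages: list[str]) -> str:
    """Assemble a simplified listwise ranking prompt (not the exact template)."""
    lines = [f"I will provide you with {len(passages)} passages, each indicated by number."]
    for i, passage in enumerate(passages, start=1):
        lines.append(f"[{i}] {passage}")
    lines.append(f"Rank the passages above by relevance to the query: {query}")
    return "\n".join(lines)


def parse_permutation(response: str, passages: list[str]) -> list[str]:
    """Turn a model response like '[2] > [1] > [3]' into reordered passages."""
    order = [int(m) - 1 for m in re.findall(r"\[(\d+)\]", response)]
    seen: set[int] = set()
    ranked = []
    for idx in order:
        # Skip out-of-range or repeated identifiers the model might emit.
        if 0 <= idx < len(passages) and idx not in seen:
            seen.add(idx)
            ranked.append(passages[idx])
    # Append any passages the model omitted, preserving their original order.
    ranked += [p for i, p in enumerate(passages) if i not in seen]
    return ranked


passages = ["cats purr", "dogs bark", "birds sing"]
print(parse_permutation("[2] > [3] > [1]", passages))
# → ['dogs bark', 'birds sing', 'cats purr']
```

In practice the end-to-end script handles retrieval (e.g., SPLADE++), sliding-window prompting for long candidate lists, and malformed-output recovery; the parser above shows only the core permutation step.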
Contributing
If you would like to contribute to the project, please refer to the contribution guidelines.
🦙🐧 Model Zoo
The following is a table of our models hosted on HuggingFace:
Model Name | Hugging Face Identifier/Link |
---|---|
RankZephyr 7B V1 - Full - BF16 | castorini/rank_zephyr_7b_v1_full |
RankVicuna 7B - V1 | castorini/rank_vicuna_7b_v1 |
RankVicuna 7B - V1 - No Data Augmentation | castorini/rank_vicuna_7b_v1_noda |
RankVicuna 7B - V1 - FP16 | castorini/rank_vicuna_7b_v1_fp16 |
RankVicuna 7B - V1 - No Data Augmentation - FP16 | castorini/rank_vicuna_7b_v1_noda_fp16 |
✨ References
If you use RankLLM, please cite the following relevant papers:
@ARTICLE{pradeep2023rankvicuna,
title = {{RankVicuna}: Zero-Shot Listwise Document Reranking with Open-Source Large Language Models},
author = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
year = {2023},
journal = {arXiv:2309.15088}
}
@ARTICLE{pradeep2023rankzephyr,
title = {{RankZephyr}: Effective and Robust Zero-Shot Listwise Reranking is a Breeze!},
author = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
year = {2023},
journal = {arXiv:2312.02724}
}
🙏 Acknowledgments
This research is supported in part by the Natural Sciences and Engineering Research Council (NSERC) of Canada.