
A package for running prompt decoders like RankVicuna

Project description

RankLLM


We offer a suite of prompt decoders, with a focus on open-source LLMs compatible with FastChat (e.g., Vicuna, Zephyr). Some of the code in this repository is borrowed from RankGPT.

Releases

current_version = 0.2.8

📟 Instructions

Create Conda Environment

conda create -n rankllm python=3.10
conda activate rankllm

Install PyTorch with CUDA

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
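
As a quick sanity check (a small sketch, not part of the official instructions), you can confirm that the CUDA-enabled build was installed and that a GPU is visible:

import torch
print(torch.__version__)          # the CUDA 11.8 wheels report a version string ending in +cu118
print(torch.cuda.is_available())  # True when a compatible GPU and driver are detected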

Install Dependencies

pip install -r requirements.txt

Run End-to-End Test

python src/rank_llm/scripts/run_rank_llm.py  --model_path=castorini/rank_zephyr_7b_v1_full --top_k_candidates=100 --dataset=dl20 \
--retrieval_method=SPLADE++_EnsembleDistil_ONNX --prompt_mode=rank_GPT  --context_size=4096 --variable_passages

Contributing

If you would like to contribute to the project, please refer to the contribution guidelines.

🦙🐧 Model Zoo

The following models are hosted on Hugging Face (model name: identifier):

  • RankZephyr 7B V1 - Full - BF16: castorini/rank_zephyr_7b_v1_full
  • RankVicuna 7B - V1: castorini/rank_vicuna_7b_v1
  • RankVicuna 7B - V1 - No Data Augmentation: castorini/rank_vicuna_7b_v1_noda
  • RankVicuna 7B - V1 - FP16: castorini/rank_vicuna_7b_v1_fp16
  • RankVicuna 7B - V1 - No Data Augmentation - FP16: castorini/rank_vicuna_7b_v1_noda_fp16
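
These checkpoints can also be loaded directly with the Hugging Face transformers library. The snippet below is a minimal sketch, not part of the official instructions; it assumes transformers and accelerate are installed and that enough GPU memory is available for a 7B model:

from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "castorini/rank_zephyr_7b_v1_full"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",  # load in the checkpoint's stored precision (BF16 for this model)
    device_map="auto",   # requires accelerate; places layers on the available devices
)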

✨ References

If you use RankLLM, please cite the following relevant papers:

[2309.15088] RankVicuna: Zero-Shot Listwise Document Reranking with Open-Source Large Language Models

@ARTICLE{pradeep2023rankvicuna,
  title   = {{RankVicuna}: Zero-Shot Listwise Document Reranking with Open-Source Large Language Models},
  author  = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
  year    = {2023},
  journal = {arXiv:2309.15088}
}

[2312.02724] RankZephyr: Effective and Robust Zero-Shot Listwise Reranking is a Breeze!

@ARTICLE{pradeep2023rankzephyr,
  title   = {{RankZephyr}: Effective and Robust Zero-Shot Listwise Reranking is a Breeze!},
  author  = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
  year    = {2023},
  journal = {arXiv:2312.02724}
}

🙏 Acknowledgments

This research is supported in part by the Natural Sciences and Engineering Research Council (NSERC) of Canada.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

rank-llm-0.2.8.tar.gz (36.3 kB)

Uploaded Source

Built Distribution

rank_llm-0.2.8-py3-none-any.whl (39.9 kB)

Uploaded Python 3

File details

Details for the file rank-llm-0.2.8.tar.gz.

File metadata

  • Download URL: rank-llm-0.2.8.tar.gz
  • Upload date:
  • Size: 36.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.13

File hashes

Hashes for rank-llm-0.2.8.tar.gz:

  • SHA256: 5f994a1c7f4b35db0065ed6b8009f9856bb0650bdc645d2f59814ba1f2e044f5
  • MD5: 6cdb0af9423594357cb7a93edb6758a3
  • BLAKE2b-256: bcc165138be3e9632e267e040d513ad09332981aa758b2130547fdb2de0bfb30

See more details on using hashes here.
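
To check a downloaded file against the digests above, you can compute the hash locally. The following is a small sketch using Python's standard hashlib module (adjust the filename or path as needed):

import hashlib

# Read the downloaded archive and compute its SHA256 digest.
with open("rank-llm-0.2.8.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Compare against the SHA256 value listed above.
print(digest)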

File details

Details for the file rank_llm-0.2.8-py3-none-any.whl.

File metadata

  • Download URL: rank_llm-0.2.8-py3-none-any.whl
  • Upload date:
  • Size: 39.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.10.13

File hashes

Hashes for rank_llm-0.2.8-py3-none-any.whl:

  • SHA256: d0ac72f01f2ca6063109470d0a6440d66137980912e22de507a30fb363d0b705
  • MD5: c6829512b6cb118ea91b7d3313ddf5f0
  • BLAKE2b-256: 63cf8d46fe52fa13827373d30cb5b2a01f5d8d3f1da286f0e1b15a34c31df789

See more details on using hashes here.
