
A package for running prompt decoders like RankVicuna

Project description

RankLLM


We offer a suite of prompt decoders, with a current focus on RankVicuna and RankZephyr. Some of the code in this repository is adapted from RankGPT!

Releases

current_version = 0.2.7
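
The package is also published on PyPI as rank-llm (see the distribution files below), so this release can be installed directly if you prefer a packaged install over a source checkout:

pip install rank-llm==0.2.7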

📟 Instructions

Create Conda Environment

conda create -n rankllm python=3.10
conda activate rankllm

Install PyTorch with CUDA

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
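
As a quick sanity check (not part of the official steps), you can confirm that the CUDA build of PyTorch is active:

python -c "import torch; print(torch.__version__, torch.version.cuda, torch.cuda.is_available())"

If the last value prints False, the CPU-only build was installed or no compatible GPU/driver was found.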

Install Dependencies

pip install -r requirements.txt

Run End-to-End Test

python src/rank_llm/scripts/run_rank_llm.py --model_path=castorini/rank_zephyr_7b_v1_full --top_k_candidates=100 --dataset=dl20 \
--retrieval_method=SPLADE++_EnsembleDistil_ONNX --prompt_mode=rank_GPT --context_size=4096 --variable_passages
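
The same script can be pointed at any checkpoint from the model zoo below. For example, this sketch swaps in RankVicuna while copying the remaining arguments from the command above (whether options such as --variable_passages are appropriate depends on how the chosen model was trained):

python src/rank_llm/scripts/run_rank_llm.py --model_path=castorini/rank_vicuna_7b_v1 --top_k_candidates=100 --dataset=dl20 \
--retrieval_method=SPLADE++_EnsembleDistil_ONNX --prompt_mode=rank_GPT --context_size=4096 --variable_passages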

Contributing

If you would like to contribute to the project, please refer to the contribution guidelines.

🦙🐧 Model Zoo

The following table lists our models hosted on Hugging Face:

Model Name | Hugging Face Identifier/Link
RankZephyr 7B V1 - Full - BF16 | castorini/rank_zephyr_7b_v1_full
RankVicuna 7B - V1 | castorini/rank_vicuna_7b_v1
RankVicuna 7B - V1 - No Data Augmentation | castorini/rank_vicuna_7b_v1_noda
RankVicuna 7B - V1 - FP16 | castorini/rank_vicuna_7b_v1_fp16
RankVicuna 7B - V1 - No Data Augmentation - FP16 | castorini/rank_vicuna_7b_v1_noda_fp16
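
To pre-download a checkpoint for offline use, one option (not part of this README's instructions, and assuming the huggingface_hub package is available) is the Hugging Face CLI:

pip install huggingface_hub
huggingface-cli download castorini/rank_zephyr_7b_v1_full

Otherwise, passing one of the identifiers above to --model_path will typically fetch the checkpoint on first use.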

✨ References

If you use RankLLM, please cite the following relevant papers:

[2309.15088] RankVicuna: Zero-Shot Listwise Document Reranking with Open-Source Large Language Models

@ARTICLE{pradeep2023rankvicuna,
  title   = {{RankVicuna}: Zero-Shot Listwise Document Reranking with Open-Source Large Language Models},
  author  = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
  year    = {2023},
  journal = {arXiv:2309.15088}
}

[2312.02724] RankZephyr: Effective and Robust Zero-Shot Listwise Reranking is a Breeze!

@ARTICLE{pradeep2023rankzephyr,
  title   = {{RankZephyr}: Effective and Robust Zero-Shot Listwise Reranking is a Breeze!},
  author  = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
  year    = {2023},
  journal = {arXiv:2312.02724}
}

🙏 Acknowledgments

This research is supported in part by the Natural Sciences and Engineering Research Council (NSERC) of Canada.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

rank-llm-0.2.7.tar.gz (36.1 kB)

Uploaded Source

Built Distribution

rank_llm-0.2.7-py3-none-any.whl (39.8 kB)

Uploaded Python 3

File details

Details for the file rank-llm-0.2.7.tar.gz.

File metadata

  • Download URL: rank-llm-0.2.7.tar.gz
  • Upload date:
  • Size: 36.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.13

File hashes

Hashes for rank-llm-0.2.7.tar.gz
Algorithm | Hash digest
SHA256 | 47360141c9ac4bb6316d8300e26ea669e0c23f82d67866cecd25b2cc6fc3e67b
MD5 | 858a2d3aeed612235703134e9e2ffabd
BLAKE2b-256 | e025cb8aa5926a6ce2fcdbad6abb51e3ee936afad3725684839c20372886c42b

See more details on using hashes here.

File details

Details for the file rank_llm-0.2.7-py3-none-any.whl.

File metadata

  • Download URL: rank_llm-0.2.7-py3-none-any.whl
  • Upload date:
  • Size: 39.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.13

File hashes

Hashes for rank_llm-0.2.7-py3-none-any.whl
Algorithm | Hash digest
SHA256 | 2b2b5bc2e0b39341cfe902e05b51dd2b7fd0715c4920455d1e26ca5d46b7b253
MD5 | 000b4022c657ca2e8ced0d7832162d93
BLAKE2b-256 | 9ed7da040c8929979fd181314c5b02559d0a89bd38799c83e217b781552f4314

See more details on using hashes here.
