
A package for running prompt-decoders like RankVicuna

Project description

RankLLM


We offer a suite of prompt-decoders, with a focus on open-source LLMs compatible with FastChat (e.g., Vicuna, Zephyr). Some of the code in this repository is borrowed from RankGPT!

Releases

current_version = 0.12.8

📟 Instructions

Create Conda Environment

conda create -n rankllm python=3.10
conda activate rankllm

Install PyTorch with CUDA

pip3 install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
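
To confirm that the CUDA build of PyTorch is active before running any models, a quick sanity check (standard PyTorch calls; prints the installed version and whether a GPU is visible) is:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"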

Install Dependencies

pip install -r requirements.txt

Run End-to-End Test

python src/rank_llm/scripts/run_rank_llm.py  --model_path=castorini/rank_zephyr_7b_v1_full --top_k_candidates=100 --dataset=dl20 \
--retrieval_method=SPLADE++_EnsembleDistil_ONNX --prompt_mode=rank_GPT  --context_size=4096 --variable_passages

Contributing

If you would like to contribute to the project, please refer to the contribution guidelines.

🦙🐧 Model Zoo

The following table lists our models hosted on Hugging Face:

Model Name | Hugging Face Identifier/Link
RankZephyr 7B V1 - Full - BF16 | castorini/rank_zephyr_7b_v1_full
RankVicuna 7B - V1 | castorini/rank_vicuna_7b_v1
RankVicuna 7B - V1 - No Data Augmentation | castorini/rank_vicuna_7b_v1_noda
RankVicuna 7B - V1 - FP16 | castorini/rank_vicuna_7b_v1_fp16
RankVicuna 7B - V1 - No Data Augmentation - FP16 | castorini/rank_vicuna_7b_v1_noda_fp16
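
As a rough illustration outside the official instructions, any checkpoint in the table can also be loaded directly with Hugging Face Transformers; the dtype choices below are assumptions matched to the BF16/FP16 naming, not requirements:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Any Hugging Face identifier from the table above works here.
model_id = "castorini/rank_zephyr_7b_v1_full"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # use torch.float16 for the *_fp16 variants
)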

✨ References

If you use RankLLM, please cite the following relevant papers:

[2309.15088] RankVicuna: Zero-Shot Listwise Document Reranking with Open-Source Large Language Models

@ARTICLE{pradeep2023rankvicuna,
  title   = {{RankVicuna}: Zero-Shot Listwise Document Reranking with Open-Source Large Language Models},
  author  = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
  year    = {2023},
  journal = {arXiv:2309.15088}
}

[2312.02724] RankZephyr: Effective and Robust Zero-Shot Listwise Reranking is a Breeze!

@ARTICLE{pradeep2023rankzephyr,
  title   = {{RankZephyr}: Effective and Robust Zero-Shot Listwise Reranking is a Breeze!},
  author  = {Ronak Pradeep and Sahel Sharifymoghaddam and Jimmy Lin},
  year    = {2023},
  journal = {arXiv:2312.02724}
}

🙏 Acknowledgments

This research is supported in part by the Natural Sciences and Engineering Research Council (NSERC) of Canada.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
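
If you only need the package itself rather than the raw files, this release can also be installed from PyPI with pip (the distribution name rank_llm normalizes to rank-llm):

pip install rank-llm==0.12.8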

Source Distribution

rank_llm-0.12.8.tar.gz (41.1 kB)

Uploaded Source

Built Distribution

rank_llm-0.12.8-py3-none-any.whl (44.8 kB)

Uploaded Python 3

File details

Details for the file rank_llm-0.12.8.tar.gz.

File metadata

  • Download URL: rank_llm-0.12.8.tar.gz
  • Upload date:
  • Size: 41.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for rank_llm-0.12.8.tar.gz

  • SHA256: 8296d57b6afadb7d79cfb54ac2cf0f9eb7bc122b65b48d64b2b41c238f01e8a0
  • MD5: 83e06565d051eaba1f5baaca3649a7ae
  • BLAKE2b-256: 94fee55ffebc4774135654210e89b90bf033fefa66444a827b3ecd0c55500bc8

See more details on using hashes here.
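
As a small sketch using only the Python standard library, a downloaded archive can be checked against the SHA256 value above (the file is assumed to be in the current directory):

import hashlib

# Compute the SHA256 digest of the downloaded sdist and compare it with the published hash.
with open("rank_llm-0.12.8.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "8296d57b6afadb7d79cfb54ac2cf0f9eb7bc122b65b48d64b2b41c238f01e8a0"
print("OK" if digest == expected else "hash mismatch")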


File details

Details for the file rank_llm-0.12.8-py3-none-any.whl.

File metadata

  • Download URL: rank_llm-0.12.8-py3-none-any.whl
  • Upload date:
  • Size: 44.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for rank_llm-0.12.8-py3-none-any.whl

  • SHA256: fe51b985a07494e8ed85f35b0114aafe9c519610d2504865d31f869f2f1fc8d7
  • MD5: f4ac9faf0fc396131f297a857e9f4dfe
  • BLAKE2b-256: 59cb9f018b2703c5b96775c70a461a14af340c54eeb0b8dec4e45dd0462a4aaf

See more details on using hashes here.

