
InPars

Inquisitive Parrots for Search
A toolkit for end-to-end synthetic data generation using LLMs for IR

Installation

Use the pip package manager to install the InPars toolkit.

pip install inpars

Usage

To generate data for one of the BEIR datasets, you can use the following command:

python -m inpars.generate \
        --prompt="inpars" \
        --dataset="trec-covid" \
        --dataset_source="ir_datasets" \
        --base_model="EleutherAI/gpt-j-6B" \
        --output="trec-covid-queries.jsonl" 
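The generation step builds a few-shot prompt from document/query pairs and asks the LLM to produce a relevant query for each in-domain document. A minimal sketch of the idea follows; the template and example pairs are illustrative assumptions, not the toolkit's actual prompt internals:

```python
# Sketch of few-shot query generation in the InPars style.
# The template and example pairs below are illustrative assumptions,
# not the exact prompt shipped with the toolkit.

FEW_SHOT_EXAMPLES = [
    ("The coronavirus spreads mainly through respiratory droplets.",
     "how does the coronavirus spread"),
    ("Masks reduce transmission of airborne viruses in enclosed spaces.",
     "do masks prevent virus transmission"),
]

def build_prompt(document: str) -> str:
    """Concatenate (document, query) demonstrations, then the target document."""
    parts = []
    for i, (doc, query) in enumerate(FEW_SHOT_EXAMPLES, start=1):
        parts.append(f"Example {i}:\nDocument: {doc}\nRelevant Query: {query}\n")
    # The LLM is asked to complete the query for the final, unlabeled document.
    parts.append(f"Example {len(FEW_SHOT_EXAMPLES) + 1}:\n"
                 f"Document: {document}\nRelevant Query:")
    return "\n".join(parts)

prompt = build_prompt("Vaccines train the immune system to recognize pathogens.")
```

The completion the model produces after the final `Relevant Query:` becomes a synthetic training query for that document.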

Additionally, you can use your own custom dataset by pointing the corpus and queries arguments to local files.
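For illustration, a custom corpus and query file could be prepared as JSONL like this; the field names below are assumptions for the sketch, so check the toolkit's documentation for the schema it actually expects:

```python
import json

# Toy local files to pass via the corpus and queries arguments.
# Field names ("doc_id", "query_id", "text") are assumed for this sketch.
corpus = [
    {"doc_id": "d1", "text": "Vaccines train the immune system to recognize pathogens."},
    {"doc_id": "d2", "text": "BM25 is a classic lexical retrieval function."},
]
queries = [{"query_id": "q1", "text": "how do vaccines work"}]

with open("corpus.jsonl", "w") as f:
    for record in corpus:
        f.write(json.dumps(record) + "\n")

with open("queries.jsonl", "w") as f:
    for record in queries:
        f.write(json.dumps(record) + "\n")
```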

The generated queries can be noisy, so a filtering step is highly recommended:

python -m inpars.filter \
        --input="trec-covid-queries.jsonl" \
        --dataset="trec-covid" \
        --filter_strategy="scores" \
        --keep_top_k="10_000" \
        --output="trec-covid-queries-filtered.jsonl"

There are currently two filtering strategies available: scores, which uses probability scores from the LLM itself, and reranker, which uses an auxiliary reranker to filter queries as introduced by InPars-v2.
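The scores strategy can be pictured as a simple top-k cut over the generator's own likelihoods. A minimal sketch, where the record shape and the `log_prob` field (mean token log-probability from the LLM) are assumptions for illustration:

```python
# Sketch of the "scores" filtering strategy: keep the top-k generated
# queries ranked by the LLM's own likelihood of having produced them.
# The record shape and "log_prob" field are illustrative assumptions.

def filter_by_score(records, keep_top_k):
    """Return the keep_top_k records with the highest log_prob."""
    ranked = sorted(records, key=lambda r: r["log_prob"], reverse=True)
    return ranked[:keep_top_k]

records = [
    {"query": "how does covid spread", "log_prob": -0.8},
    {"query": "banana bread recipe", "log_prob": -4.2},
    {"query": "covid incubation period", "log_prob": -1.1},
]
kept = filter_by_score(records, keep_top_k=2)  # drops the off-topic query
```

The reranker strategy works analogously but ranks by an auxiliary reranker's relevance score instead of the generator's likelihood.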

To prepare the training file, negative examples are mined by retrieving candidate documents with BM25 using the generated queries and sampling from these candidates. This is done using the following command:

python -m inpars.generate_triples \
        --input="trec-covid-queries-filtered.jsonl" \
        --dataset="trec-covid" \
        --output="trec-covid-triples.tsv"
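The mining step above can be sketched as follows: the document a query was generated from serves as the positive, and negatives are sampled from the BM25 candidates retrieved for that query, excluding the positive itself. The candidate lists here are hard-coded stand-ins for real BM25 output:

```python
import random

# Sketch of triple mining: (query, positive, negative).
# The BM25 candidate list is a hard-coded stand-in for illustration.

def mine_triples(query, positive_doc_id, bm25_candidates, rng, n_negatives=1):
    """Sample negatives from retrieved candidates, excluding the positive."""
    pool = [d for d in bm25_candidates if d != positive_doc_id]
    negatives = rng.sample(pool, k=min(n_negatives, len(pool)))
    return [(query, positive_doc_id, neg) for neg in negatives]

rng = random.Random(0)
triples = mine_triples("how does covid spread", "d1", ["d1", "d7", "d9"], rng)
```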

With the generated triples file, you can train the model using the following command:

python -m inpars.train \
        --triples="trec-covid-triples.tsv" \
        --base_model="castorini/monot5-3b-msmarco-10k" \
        --output_dir="./reranker/" \
        --max_steps="156"

You can choose different base models, hyperparameters, and training strategies among those supported by the HuggingFace Trainer.
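For intuition, triples are typically turned into sequence-to-sequence training text following the monoT5 convention of a "Query: ... Document: ... Relevant:" input with "true"/"false" targets; the toolkit's exact preprocessing may differ from this sketch:

```python
# Sketch of converting a (query, positive, negative) triple into
# monoT5-style training pairs. The toolkit's preprocessing may differ.

def monot5_examples(query, positive, negative):
    template = "Query: {q} Document: {d} Relevant:"
    return [
        (template.format(q=query, d=positive), "true"),   # relevant pair
        (template.format(q=query, d=negative), "false"),  # non-relevant pair
    ]

examples = monot5_examples("how does covid spread",
                           "The virus spreads via respiratory droplets.",
                           "Bananas are rich in potassium.")
```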

After finetuning the reranker, you can rerank prebuilt runs from the BEIR benchmark or specify a custom run using the following command:

python -m inpars.rerank \
        --model="./reranker/" \
        --dataset="trec-covid" \
        --output_run="trec-covid-run.txt"

Finally, you can evaluate the reranked run using the following command:

python -m inpars.evaluate \
        --dataset="trec-covid" \
        --run="trec-covid-run.txt"
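The run file evaluated above follows the standard TREC format, one line per scored (query, document) pair: `qid Q0 docid rank score tag`. As a toy illustration of what evaluation does with such a file, the sketch below parses run lines and computes Precision@k against a small qrels mapping; the toolkit itself reports standard IR metrics rather than this hand-rolled one:

```python
# Toy illustration of TREC-run evaluation: parse run lines and compute
# Precision@k against a qrels mapping of query -> set of relevant doc ids.

def parse_run(lines):
    """Group run lines by query id, ordered by rank."""
    run = {}
    for line in lines:
        qid, _, docid, rank, score, _ = line.split()
        run.setdefault(qid, []).append((int(rank), docid))
    return {q: [d for _, d in sorted(docs)] for q, docs in run.items()}

def precision_at_k(run, qrels, k):
    scores = []
    for qid, ranking in run.items():
        relevant = qrels.get(qid, set())
        hits = sum(1 for d in ranking[:k] if d in relevant)
        scores.append(hits / k)
    return sum(scores) / len(scores)

lines = [
    "q1 Q0 d3 1 12.5 reranker",
    "q1 Q0 d7 2 11.0 reranker",
]
run = parse_run(lines)
p = precision_at_k(run, {"q1": {"d3"}}, k=2)  # one relevant doc in top-2
```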

Resources

Generated datasets

Synthetic datasets generated by InPars-v1 are available for download.

Finetuned models

Download finetuned models from InPars-v2 on HuggingFace Hub.

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

References

If you use this tool, please cite the original InPars paper published at SIGIR '22 and/or the InPars-v2 report.

@inproceedings{inpars,
  author = {Bonifacio, Luiz and Abonizio, Hugo and Fadaee, Marzieh and Nogueira, Rodrigo},
  title = {{InPars}: Unsupervised Dataset Generation for Information Retrieval},
  year = {2022},
  isbn = {9781450387323},
  publisher = {Association for Computing Machinery},
  address = {New York, NY, USA},
  url = {https://doi.org/10.1145/3477495.3531863},
  doi = {10.1145/3477495.3531863},
  booktitle = {Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  pages = {2387–2392},
  numpages = {6},
  keywords = {generative models, large language models, question generation, synthetic datasets, few-shot models, multi-stage ranking},
  location = {Madrid, Spain},
  series = {SIGIR '22}
}

@misc{inparsv2,
  doi = {10.48550/ARXIV.2301.01820},
  url = {https://arxiv.org/abs/2301.01820},
  author = {Jeronymo, Vitor and Bonifacio, Luiz and Abonizio, Hugo and Fadaee, Marzieh and Lotufo, Roberto and Zavrel, Jakub and Nogueira, Rodrigo},
  title = {{InPars-v2}: Large Language Models as Efficient Dataset Generators for Information Retrieval},
  publisher = {arXiv},
  year = {2023},
  copyright = {Creative Commons Attribution 4.0 International}
}
