Last released May 21, 2024
Package for LLM Evaluation
Last released May 8, 2024
Last released Feb 27, 2024
PromptInject is a framework that assembles prompts in a modular fashion to provide a quantitative analysis of the robustness of LLMs to adversarial prompt attacks.
Supported by