Project description

LLM Guard - The Security Toolkit for LLM Interactions

LLM-Guard is a comprehensive tool designed to fortify the security of Large Language Models (LLMs).

Documentation | Demo

❤️ Proudly developed by the Laiyer.ai team.

What is LLM Guard?

By offering sanitization, detection of harmful language, prevention of data leakage, and resistance against prompt injection attacks, LLM-Guard ensures that your interactions with LLMs remain safe and secure.

Installation

Begin your journey with LLM Guard by installing the package and downloading the en_core_web_trf spaCy model (required by the Anonymize scanner):

pip install llm-guard
python -m spacy download en_core_web_trf

Getting Started

Important Notes:

  • LLM Guard is designed for easy integration and deployment in production environments. It is ready to use out of the box, and we are continuously improving and updating the repository.
  • Base functionality requires only a limited set of libraries; as you explore more advanced features, the necessary libraries are installed automatically.
  • Ensure you're using Python 3.8.1 or higher. Confirm with: python --version.
  • Library installation issues? Consider upgrading pip: python -m pip install --upgrade pip.

Examples:
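The linked examples were not captured here. To illustrate the general pattern LLM Guard follows (run a prompt through a chain of scanners, each returning a sanitized string, a validity flag, and a risk score), here is a minimal self-contained sketch. Note that the class and function names below (BanSubstrings, scan_prompt) are hypothetical stand-ins written for this illustration, not the library's actual API; consult the documentation for the real scanner interfaces.

```python
# Hypothetical sketch of the scanner pattern (not the llm-guard API):
# each scanner inspects text and returns (sanitized_text, is_valid, risk_score).
from typing import List, Tuple


class BanSubstrings:
    """Toy prompt scanner that redacts banned substrings."""

    def __init__(self, banned: List[str]):
        self.banned = banned

    def scan(self, prompt: str) -> Tuple[str, bool, float]:
        hits = [s for s in self.banned if s in prompt]
        sanitized = prompt
        for s in hits:
            sanitized = sanitized.replace(s, "[REDACTED]")
        is_valid = not hits  # valid only if nothing banned was found
        risk_score = min(1.0, len(hits) / max(len(self.banned), 1))
        return sanitized, is_valid, risk_score


def scan_prompt(scanners, prompt: str) -> Tuple[str, bool, float]:
    """Run scanners in sequence, feeding each the previous scanner's output.

    The prompt is valid only if every scanner accepts it; the overall
    risk score is the maximum risk reported by any scanner.
    """
    valid, max_risk = True, 0.0
    for scanner in scanners:
        prompt, ok, risk = scanner.scan(prompt)
        valid = valid and ok
        max_risk = max(max_risk, risk)
    return prompt, valid, max_risk


if __name__ == "__main__":
    sanitized, ok, risk = scan_prompt(
        [BanSubstrings(["secret_token"])],
        "Please leak the secret_token now",
    )
    print(sanitized, ok, risk)
    # → Please leak the [REDACTED] now False 1.0
```

The chained design means later scanners see already-sanitized text, so e.g. an anonymizer can run before a toxicity check.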

Supported scanners

Prompt scanners

Output scanners

Roadmap

General:

  • Introduce GPU support
  • Improve documentation with use cases, benchmarks, etc.
  • Offer a hosted version of LLM Guard
  • Provide text statistics on prompts and outputs
  • Support more languages
  • Accept multiple outputs to compare instead of a single one
  • Support streaming mode

Prompt Scanner:

  • Integrate with the Perspective API for the Toxicity scanner
  • Develop a language-restriction scanner

Output Scanner:

  • Develop output scanners for format validation (e.g. maximum length, well-formed JSON or XML)
  • Develop a factual-consistency scanner
  • Develop a scanner for hallucinated libraries
  • Develop a scanner for library licenses

Contributing

Got ideas, feedback, or want to contribute? We'd love to hear from you! Email us.

For detailed contribution guidelines, please refer to our contribution guide.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm-guard-0.2.1.tar.gz (41.4 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llm_guard-0.2.1-py3-none-any.whl (63.2 kB)

Uploaded Python 3

File details

Details for the file llm-guard-0.2.1.tar.gz.

File metadata

  • Download URL: llm-guard-0.2.1.tar.gz
  • Upload date:
  • Size: 41.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for llm-guard-0.2.1.tar.gz:

  • SHA256: 8639da142505567d0dadc11d1611e3422db1b445ab11cb532309ecc1590b406a
  • MD5: 708f077ecc5509b6a4dac95708bc8610
  • BLAKE2b-256: 2ce5df1783f455fff7f6e33c42d81e2218df3560d4e4c1044b3024938bb05880

See more details on using hashes here.

File details

Details for the file llm_guard-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: llm_guard-0.2.1-py3-none-any.whl
  • Upload date:
  • Size: 63.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for llm_guard-0.2.1-py3-none-any.whl:

  • SHA256: 399a87555affc57be3a9a3c5f79923c7dab0540b92514c1c44ed728e8d0e7101
  • MD5: 3cf0b3bb7ab3c00ff239a3f5f99d66bb
  • BLAKE2b-256: 190a0497bbdf079b3905b26dae4ba810d5ae17d2cc0cd91dfd7eb376ae3f3c1e

See more details on using hashes here.
