Project description

LLM Guard - The Security Toolkit for LLM Interactions

LLM Guard by Laiyer.ai is a comprehensive tool designed to fortify the security of Large Language Models (LLMs).

Documentation | Demo

Production Support / Help for companies

We're eager to provide personalized assistance when deploying LLM Guard to a production environment.

What is LLM Guard?

By offering sanitization, detection of harmful language, prevention of data leakage, and resistance against prompt injection attacks, LLM Guard ensures that your interactions with LLMs remain safe and secure.

Installation

Begin your journey with LLM Guard by installing the package:

pip install llm-guard

Then install a spaCy model for the Anonymize scanner. By default, you can use:

pip install https://huggingface.co/beki/en_spacy_pii_fast/resolve/main/en_spacy_pii_fast-any-py3-none-any.whl
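
To confirm the model installed correctly, you can load it by name with spaCy. This is a minimal sketch; the sample sentence and whatever entities it prints are illustrative only.

import spacy

# Load the PII model installed above; raises OSError if the wheel isn't installed
nlp = spacy.load("en_spacy_pii_fast")

# Inspect which PII entities the model detects in a sample sentence
doc = nlp("John Doe's phone number is 212-555-5555.")
print([(ent.text, ent.label_) for ent in doc.ents])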

Getting Started

Important Notes:

  • LLM Guard is designed for easy integration and deployment in production environments. While it's ready to use out of the box, note that we're continuously improving and updating the repository.
  • Base functionality requires only a small set of libraries; as you explore more advanced features, the necessary libraries are installed automatically.
  • Ensure you're using Python version 3.8.1 or higher. Confirm with: python --version.
  • Library installation issues? Consider upgrading pip: python -m pip install --upgrade pip.

Examples:
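
Below is a minimal end-to-end sketch. It assumes the scan_prompt/scan_output helpers, the Vault, and the scanner names match the library's documented 0.3.x example; check the documentation for the scanners available in your installed version.

from llm_guard import scan_output, scan_prompt
from llm_guard.input_scanners import Anonymize, PromptInjection, TokenLimit, Toxicity
from llm_guard.output_scanners import Deanonymize, NoRefusal, Relevance, Sensitive
from llm_guard.vault import Vault

# The vault stores values removed by Anonymize so Deanonymize can restore them later
vault = Vault()
input_scanners = [Anonymize(vault), Toxicity(), TokenLimit(), PromptInjection()]
output_scanners = [Deanonymize(vault), NoRefusal(), Relevance(), Sensitive()]

prompt = "Write a short note to John Doe at john.doe@example.com."

# Each call returns the (possibly sanitized) text, per-scanner validity flags, and risk scores
sanitized_prompt, results_valid, results_score = scan_prompt(input_scanners, prompt)
if not all(results_valid.values()):
    raise ValueError(f"Prompt is not valid, scores: {results_score}")

# model_response is a placeholder for the text returned by your LLM call
model_response = "Sure, here is a short note for the recipient."
sanitized_response, results_valid, results_score = scan_output(
    output_scanners, sanitized_prompt, model_response
)
print(sanitized_response)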

Supported scanners

Prompt scanners

Output scanners

Roadmap

You can find our roadmap here. Please don't hesitate to contribute or create issues; it helps us improve LLM Guard!

Contributing

Have ideas or feedback, or want to contribute? We'd love to hear from you! Email us.

For detailed guidelines on contributions, please see our contribution guide.


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm-guard-0.3.0.tar.gz (62.1 kB)

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llm_guard-0.3.0-py3-none-any.whl (131.3 kB)

File details

Details for the file llm-guard-0.3.0.tar.gz.

File metadata

  • Download URL: llm-guard-0.3.0.tar.gz
  • Upload date:
  • Size: 62.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for llm-guard-0.3.0.tar.gz:

  • SHA256: 13cd1db1fedc8f2fc25554dde312dd72b609dfce95c113e7f5de952945da64bc
  • MD5: 21d406a8b340765bf76196585ac9d103
  • BLAKE2b-256: 309ea7dfe5629c5ffd606ef7390304d3bb6282718fcad7556b8a053e52346286

See more details on using hashes here.
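
As an illustration, a downloaded file can be checked against the SHA256 digest above with Python's standard hashlib (the local file path is an assumption):

import hashlib

# Assumes the source distribution was downloaded into the current directory
path = "llm-guard-0.3.0.tar.gz"
with open(path, "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

expected = "13cd1db1fedc8f2fc25554dde312dd72b609dfce95c113e7f5de952945da64bc"
print("OK" if digest == expected else "MISMATCH")

The same check applies to the wheel below, using its own digest.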

File details

Details for the file llm_guard-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: llm_guard-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 131.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.3

File hashes

Hashes for llm_guard-0.3.0-py3-none-any.whl:

  • SHA256: d8ab4838a3018f05cc67fde12188b0893388905dfd2092e1781abf523601b5ea
  • MD5: 84e6412ec60ce400c2baf2f4a9a77c5f
  • BLAKE2b-256: ce0c65e729fb245c4ce9ed4c793a1715c05d039567ac52e7376ee2dc01164d85

See more details on using hashes here.
