Skip to main content

LLM-Guard is a comprehensive tool designed to fortify the security of Large Language Models (LLMs). By offering sanitization, detection of harmful language, prevention of data leakage, and resistance against prompt injection attacks, LLM-Guard ensures that your interactions with LLMs remain safe and secure.

Project description

LLM Guard - The Security Toolkit for LLM Interactions

LLM Guard by Laiyer.ai is a comprehensive tool designed to fortify the security of Large Language Models (LLMs).

Documentation | Demo | Changelog

MIT license Code style: black PyPI - Python Version Downloads Downloads Twitter

Production Support / Help for companies

We're eager to provide personalized assistance when deploying your LLM Guard to a production environment.

What is LLM Guard?

LLM-Guard

By offering sanitization, detection of harmful language, prevention of data leakage, and resistance against prompt injection attacks, LLM-Guard ensures that your interactions with LLMs remain safe and secure.

Installation

Begin your journey with LLM Guard by downloading the package:

pip install llm-guard

Getting Started

Important Notes:

  • LLM Guard is designed for easy integration and deployment in production environments. While it's ready to use out-of-the-box, please be informed that we're constantly improving and updating the repository.
  • Base functionality requires a limited number of libraries. As you explore more advanced features, necessary libraries will be automatically installed.
  • Ensure you're using Python version 3.8.1 or higher. Confirm with: python --version.
  • Library installation issues? Consider upgrading pip: python -m pip install --upgrade pip.

Examples:

Supported scanners

Prompt scanners

Output scanners

Roadmap

You can find our roadmap here. Please don't hesitate to contribute or create issues, it helps us improve LLM Guard!

Contributing

Got ideas, feedback, or wish to contribute? We'd love to hear from you! Email us.

For detailed guidelines on contributions, kindly refer to our contribution guide.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

llm-guard-0.3.1.tar.gz (49.3 kB view details)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

llm_guard-0.3.1-py3-none-any.whl (103.0 kB view details)

Uploaded Python 3

File details

Details for the file llm-guard-0.3.1.tar.gz.

File metadata

  • Download URL: llm-guard-0.3.1.tar.gz
  • Upload date:
  • Size: 49.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for llm-guard-0.3.1.tar.gz
Algorithm Hash digest
SHA256 a830386603c8ed555edcdbb6ba1ffec0077e40a0d030835482069186dfba6e1b
MD5 dc6ba087ff7b194f6247eabb3c5921a3
BLAKE2b-256 758125dc4018c91a0e18d91bcd43c200bd21af9b1aee31d43e37ad3607ac7604

See more details on using hashes here.

File details

Details for the file llm_guard-0.3.1-py3-none-any.whl.

File metadata

  • Download URL: llm_guard-0.3.1-py3-none-any.whl
  • Upload date:
  • Size: 103.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.5

File hashes

Hashes for llm_guard-0.3.1-py3-none-any.whl
Algorithm Hash digest
SHA256 486306dba9e051fcbd5a38343ffc7d45a5af53083db11669e0a8f92a70a54bf5
MD5 4dd00805f1507e08cca18218cf5138af
BLAKE2b-256 05108b0da620655c6aa15141ee8a65ecb283101fa506d97a59c6834cb0a632f7

See more details on using hashes here.

Supported by

AWS Cloud computing and Security Sponsor Datadog Monitoring Depot Continuous Integration Fastly CDN Google Download Analytics Pingdom Monitoring Sentry Error logging StatusPage Status page