
An extension for using Llama Guard with AIConfig

Project description

Llama Guard with AIConfig

Llama Guard is a 7B model released by Meta. This extension allows you to use it with AIConfig.

Llama Guard allows you to define your own “safety taxonomy”: custom policies that determine which interactions are safe vs. unsafe between humans (prompts) and AI models (responses). What makes this cool is that it allows you to enforce your own policies ON TOP of the standard guardrails that a model ships with (instead of merely overriding them).
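To make “safety taxonomy” concrete, here is an illustrative category written in the general style of a Llama Guard taxonomy entry. The category name and rules are invented for this example, and how (or whether) you pass a custom taxonomy through this extension depends on its configuration, so treat this purely as a sketch of the idea:

```python
# A made-up policy category in the general shape of a Llama Guard
# taxonomy entry: a numbered category, "Should not" rules, and "Can" rules.
CUSTOM_TAXONOMY = """O1: Financial Advice.
Should not
- Recommend specific securities or promise investment returns.
Can
- Explain general financial concepts and terminology."""
```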

> [!NOTE]
> This extension also loads the entire model into memory.

Part 1: Installing, importing, and using this extension

  1. Install this module: run pip3 install aiconfig_extension_llama_guard in terminal
  2. Add these lines to your code:
from aiconfig_extension_llama_guard import LLamaGuardParser
from aiconfig.registry import ModelParserRegistry
  3. In code, construct the model parser from this extension and register it with the registry: ModelParserRegistry.register_model_parser(LLamaGuardParser()). You can read the docstrings under the ModelParserRegistry class for more info on what this does.
  4. Use the LLamaGuardParser model parser however you please. Check out our tutorial to get started: a video walkthrough and a Jupyter notebook tutorial are available. A minimal end-to-end sketch follows this list.
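For orientation, here is a minimal sketch of what steps 2–4 can look like together. It assumes the parser class is named LLamaGuardParser; the file name (safety_demo.aiconfig.json) and prompt name (check_user_message) are placeholders for your own aiconfig, and the AIConfigRuntime.load / run calls follow the core aiconfig package's API. Adapt the names to your setup:

```python
import asyncio

from aiconfig import AIConfigRuntime
from aiconfig.registry import ModelParserRegistry
from aiconfig_extension_llama_guard import LLamaGuardParser

# Register the Llama Guard model parser so aiconfig knows how to run
# prompts whose model is handled by this extension.
ModelParserRegistry.register_model_parser(LLamaGuardParser())


async def main():
    # "safety_demo.aiconfig.json" and "check_user_message" are placeholder
    # names for your own aiconfig file and prompt.
    config = AIConfigRuntime.load("safety_demo.aiconfig.json")
    result = await config.run("check_user_message")
    print(result)


if __name__ == "__main__":
    asyncio.run(main())
```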

Part 2: Updating & Developing this extension

If you are not developing this extension locally (just using the published extension), feel free to ignore this part.

  1. Navigate to extensions/LLama-Guard/python and run this command: pip3 install -e . (this installs the Python module in editable mode, linked to this directory)
  2. Edit and test the extension as you please. Feel free to submit a pull request on GitHub!
  3. After you're done testing, be sure to uninstall the local link to this directory if you ever want to use the published version: pip3 uninstall aiconfig_extension_llama_guard
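One quick sanity check for whether Python is importing your editable checkout or the published package is to look at where the module was loaded from:

```python
import aiconfig_extension_llama_guard

# __file__ shows which copy of the module was imported; for an editable
# install it points into your local extensions/LLama-Guard/python checkout.
print(aiconfig_extension_llama_guard.__file__)
```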

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aiconfig_extension_llama_guard-0.0.3.tar.gz (6.5 kB)

Built Distribution

aiconfig_extension_llama_guard-0.0.3-py3-none-any.whl

File details

Details for the file aiconfig_extension_llama_guard-0.0.3.tar.gz.

File metadata

File hashes

Hashes for aiconfig_extension_llama_guard-0.0.3.tar.gz:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | d2d3c29a4496ddb405eaa506aaba12602de87a6c7d89216d314e2779ccc56b21 |
| MD5 | 6a2c29369bf2134a83d23745769dc65d |
| BLAKE2b-256 | ba84f86b4f99f5d6bdfb82848f3cb047ab6ce2e2edcbb246d463ca1389b73887 |


File details

Details for the file aiconfig_extension_llama_guard-0.0.3-py3-none-any.whl.

File metadata

File hashes

Hashes for aiconfig_extension_llama_guard-0.0.3-py3-none-any.whl:

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | ffa31aa79c8b1ccfa72e7db1aeba24a0ebd83898f076e6c77c5863ba01e3d924 |
| MD5 | 4dd0d389eb47d9669910920448f07c8c |
| BLAKE2b-256 | 0ec9ccd5d891afdda54c355d0817d7003472db90b253cb9c11d7728f47153319 |

