An extension for using Llama Guard with AIConfig
Project description
Llama Guard with AIConfig
Llama Guard is a 7B model released by Meta. This extension allows you to use it with AIConfig.
Llama Guard allows you to define your own “safety taxonomy”: custom policies that determine which interactions are safe vs. unsafe between humans (prompts) and AI models (responses). What makes this cool is that it allows you to enforce your own policies on top of the standard guardrails that a model ships with, instead of merely overriding them.
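For illustration, a taxonomy is just a structured block of text listing the policy categories that each conversation should be checked against. The categories and wording below are hypothetical examples, not part of this extension's API; they only sketch the general numbered-category format described for Llama Guard.

```python
# Hypothetical custom safety taxonomy, for illustration only.
# Llama Guard is prompted with a numbered list of policy categories like this
# and classifies each prompt/response as safe or unsafe against them.
CUSTOM_TAXONOMY = """O1: Competitor Mentions.
Should not
- Recommend or endorse products from competing brands.
O2: Financial Advice.
Should not
- Provide personalized investment advice or promise returns.
Can
- Explain general financial concepts at a high level."""
```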
> [!NOTE]
> This extension also loads the entire model into memory.
Part 1: Installing, Importing, and Using this extension
- Install this module: run `pip3 install aiconfig_extension_llama_guard` in terminal.
- Add these lines to your code:

```python
from aiconfig_extension_llama_guard import LLamaGuardParser
from aiconfig.registry import ModelParserRegistry
```

- In code, construct the model parser from this extension and add it to the registry: `ModelParserRegistry.register_model_parser(LLamaGuardParser())`. You can read the docstrings under the `ModelParserRegistry` class for more info on what this does.
- Use the `LLamaGuardParser` model parser however you please. Check out our tutorial to get started: you can watch the video walkthrough or follow the Jupyter notebook. A minimal usage sketch also follows this list.
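For reference, here is a minimal sketch of registering the parser and then running a prompt through an AIConfig. The config file name (`my_aiconfig.json`) and prompt name (`safety_check`) are placeholder assumptions, not files shipped with this extension; see the linked tutorial for a complete, working setup.

```python
import asyncio

from aiconfig import AIConfigRuntime
from aiconfig.registry import ModelParserRegistry
from aiconfig_extension_llama_guard import LLamaGuardParser

# Register the Llama Guard model parser so AIConfig can route prompts to it.
ModelParserRegistry.register_model_parser(LLamaGuardParser())

async def main():
    # "my_aiconfig.json" and "safety_check" are placeholder names for an
    # aiconfig file and a prompt configured to use the Llama Guard parser.
    config = AIConfigRuntime.load("my_aiconfig.json")
    result = await config.run("safety_check")
    print(result)

asyncio.run(main())
```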
Part 2: Updating & Developing this extension
If you are not developing this extension locally (i.e., you are just using the published extension), feel free to ignore this part.

- Navigate to `extensions/LLama-Guard/python` and run this command: `pip3 install -e .` (this creates a local copy of the Python module which is linked to this directory).
- Edit and test the extension as you please. Feel free to submit a pull request on GitHub!
- After you're done testing, be sure to uninstall the local link to this directory if you ever want to use the published version: `pip3 uninstall aiconfig_extension_llama_guard`
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
aiconfig_extension_llama_guard-0.0.3.tar.gz (6.5 kB)
Built Distribution
aiconfig_extension_llama_guard-0.0.3-py3-none-any.whl (6.6 kB)
File details
Details for the file `aiconfig_extension_llama_guard-0.0.3.tar.gz`.
File metadata
- Download URL: aiconfig_extension_llama_guard-0.0.3.tar.gz
- Upload date:
- Size: 6.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | d2d3c29a4496ddb405eaa506aaba12602de87a6c7d89216d314e2779ccc56b21
MD5 | 6a2c29369bf2134a83d23745769dc65d
BLAKE2b-256 | ba84f86b4f99f5d6bdfb82848f3cb047ab6ce2e2edcbb246d463ca1389b73887
File details
Details for the file `aiconfig_extension_llama_guard-0.0.3-py3-none-any.whl`.
File metadata
- Download URL: aiconfig_extension_llama_guard-0.0.3-py3-none-any.whl
- Upload date:
- Size: 6.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.6
File hashes
Algorithm | Hash digest
---|---
SHA256 | ffa31aa79c8b1ccfa72e7db1aeba24a0ebd83898f076e6c77c5863ba01e3d924
MD5 | 4dd0d389eb47d9669910920448f07c8c
BLAKE2b-256 | 0ec9ccd5d891afdda54c355d0817d7003472db90b253cb9c11d7728f47153319