An extension for using llama-guard with aiconfig
Project description
Using LLama-Guard for model parsing with AIConfig
We are prioritizing Python library usage.
Part 1: Update and test this extension
If you are not testing locally (just using the published extension), ignore this and go to Part 2
- From the `aiconfig/HuggingFaceTransformers` directory, run this command: `pip install build && cd python && python -m build && pip install dist/*.whl`
- After you're done testing, be sure to delete the generated folder(s) in the `aiconfig/HuggingFaceTransformers` dir. They'll probably look something like `python/dist` and `python/<package_name>.egg-info`
Part 2: Importing and using this extension
- Import whatever outputs pip gives from the last command. For now it's `import text-generation`, but this may change in the future.
- In code, add all the relevant model parsers that you want to use from this extension to the registry: `ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())`. You can read the docstrings under the `ModelParserRegistry` class for more info.
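To illustrate the registration pattern above, here is a minimal, self-contained sketch. The `ModelParserRegistry` and `HuggingFaceTextGenerationTransformer` classes below are hypothetical stand-ins that mirror the register-then-look-up flow; the real classes live in the aiconfig package and its extensions, and their actual APIs may differ.

```python
# Hypothetical minimal sketch of the registry pattern described above.
# The real ModelParserRegistry is provided by the aiconfig package.

class ModelParserRegistry:
    """Maps a parser id to a registered model parser instance."""
    _parsers = {}

    @classmethod
    def register_model_parser(cls, parser):
        # Store the parser under its id so AIConfig can dispatch to it later.
        cls._parsers[parser.id()] = parser

    @classmethod
    def get_model_parser(cls, parser_id):
        return cls._parsers[parser_id]


class HuggingFaceTextGenerationTransformer:
    """Stand-in for the extension's model parser."""

    def id(self):
        return "HuggingFaceTextGenerationTransformer"


# Register once at startup, then the registry can resolve the parser by id.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())
parser = ModelParserRegistry.get_model_parser("HuggingFaceTextGenerationTransformer")
```

In the real library you would register the parser imported from this extension instead of the stand-in class.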
Hashes for aiconfig_extension_llama_guard-0.0.2.tar.gz

| Algorithm | Hash digest |
|---|---|
| SHA256 | 0bd689b8c1f7072a4c9f5384665b5407ae32e48bba22fe46ccb79bda7fe2d105 |
| MD5 | 82d8fde3c42ce5acac2b906760eb7109 |
| BLAKE2b-256 | d591529b7ed1d8d494491cba608059ac0a57b93ee02c8ca576bb4b049adbfac0 |
Hashes for aiconfig_extension_llama_guard-0.0.2-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | d5be4574e8651a728e7f8b6a75a6d9ac2498716c1de1b8e9279f5759eb2df7fc |
| MD5 | 9aad1239ce90532ce1ffeb3c9b01566c |
| BLAKE2b-256 | 243cfc12f9afbd8cc7a0bfc4476dffb86e2cf20f7e2ff13072d639ffaa4176ef |