An extension for using llama-guard with aiconfig
Using LLama Guard for model parsing with AIConfig
We are prioritizing Python library usage.
Part 1: Update and test this extension
If you are not testing locally (just using the published extension), ignore this and go to Part 2
- From the `aiconfig/HuggingFaceTransformers` directory, run this command: `pip install build && cd python && python -m build && pip install dist/*.whl`
- After you're done testing, be sure to delete the generated folder(s) in the `aiconfig/HuggingFaceTransformers` dir. They'll probably look something like `python/dist` and `python/<package_name>.egg-info`.
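The steps above can be sketched as a single shell sequence (assuming you start from the `aiconfig/HuggingFaceTransformers` directory named above; adjust the path if your checkout differs):

```shell
# Build and install the extension locally for testing
pip install build        # PEP 517 build frontend (one-time)
cd python
python -m build          # writes the sdist and wheel into dist/
pip install dist/*.whl   # install the freshly built wheel

# After testing, remove the generated build artifacts
rm -rf dist ./*.egg-info
```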
Part 2: Importing and using this extension
- Import whatever outputs pip gives from the last command. For now it's `import text_generation`, but this may change in the future.
- In code, add all the relevant model parsers that you want to use from this extension to the registry: `ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())`. You can read the docstrings under the `ModelParserRegistry` class for more info.
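Putting Part 2 together, a minimal sketch of registering a parser at startup. The module path `aiconfig.registry` and the parser class name are assumptions based on the package and class names quoted above; check what your installed version actually exposes:

```python
# Hypothetical usage sketch — module path and class name are assumptions,
# not confirmed API; verify against the installed package.
from aiconfig.registry import ModelParserRegistry
from text_generation import HuggingFaceTextGenerationTransformer  # assumed import

# Register the parser once, before loading any AIConfig that uses it,
# so prompts referencing this model can be routed to the parser.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())
```

Registration is global, so doing it once at application startup is enough; subsequent `AIConfigRuntime` loads will resolve the model through the registry.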
Hashes for aiconfig_extension_llama_guard-0.0.1.tar.gz (source distribution)

Algorithm | Hash digest
---|---
SHA256 | c78d3c7ffa1c47eef0a0ac9b2d9855eb0ca3f60c88721ed618b6402dc3aa85bc
MD5 | f95689bee19630e558154235b7084f8f
BLAKE2b-256 | 055ee3f153f5774645119fc4bf846e3fe2b283e2d56b29727eb54f7a84bef8f2
Hashes for aiconfig_extension_llama_guard-0.0.1-py3-none-any.whl (built distribution)

Algorithm | Hash digest
---|---
SHA256 | 754dfca6f2219662035d91a1b297db5c89df6b9f52ac53b03e4233ae458b3671
MD5 | ca9c01c74cb7a12d8e1ea11930aaa536
BLAKE2b-256 | f64f4e6dae6ce8ae8df425e7c9c93c6a392626893d6e3a935024fa8cee9b22a4