An extension for using llama-guard with aiconfig
Project description
Using Llama-Guard for model parsing with AIConfig
We are prioritizing Python library usage.
Part 1: Update and test this extension
If you are not testing locally (just using the published extension), ignore this and go to Part 2
- From the `aiconfig/HuggingFaceTransformers` directory, run this command: `pip install build && cd python && python -m build && pip install dist/*.whl`
- After you're done testing, be sure to delete the generated folder(s) in the `aiconfig/HuggingFaceTransformers` dir. They'll probably look something like `python/dist` and `python/<package_name>.egg-info`
Part 2: Importing and using this extension
- Import whatever pip outputs from the last command. For now it's `import text-generation`, but this may change in the future.
- In code, add all the relevant model parsers that you want to use from this extension to the registry: `ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())`. You can read the docstrings under the `ModelParserRegistry` class for more info.
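The registration step above can be sketched as follows. This is a minimal, self-contained stand-in that mimics the registry pattern; the real `ModelParserRegistry` and `HuggingFaceTextGenerationTransformer` come from the aiconfig packages, and their actual interfaces may differ (the `id()` method here is an assumption for illustration).

```python
# Minimal sketch of the registry pattern used by AIConfig.
# The real classes live in the aiconfig packages; these stand-ins
# only illustrate how register/lookup fits together.
class ModelParserRegistry:
    _parsers: dict = {}

    @classmethod
    def register_model_parser(cls, parser):
        # Key the parser by its id so prompts can resolve it later.
        cls._parsers[parser.id()] = parser

    @classmethod
    def get_model_parser(cls, parser_id):
        return cls._parsers[parser_id]


class HuggingFaceTextGenerationTransformer:
    # Illustrative parser id; the real extension defines its own.
    def id(self) -> str:
        return "HuggingFaceTextGenerationTransformer"


# Register once at startup, before loading or running an AIConfig.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())
```

After registration, AIConfig can look the parser up by id when resolving prompts that target this model.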
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
Hashes for aiconfig_extension_gemini-0.0.1.tar.gz
Algorithm | Hash digest
---|---
SHA256 | f3c909faa57de75c385fcbfb0eecd82da1e0734931b046eff42c5dc47967b43e
MD5 | f5cf37ea4827c2787f5cdd12a5aa6c1f
BLAKE2b-256 | efc53fa5c8374356f49e638f9c54fd1bb8422e6244d54703b2b2a13aade8bdae
Hashes for aiconfig_extension_gemini-0.0.1-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | e4d837c681784f8bad0f337b9d9071af217ffe73e32d12f9a61a3cd7e7f49d58
MD5 | ee0da06d558a1daf8738e6c3edf47d96
BLAKE2b-256 | 6e4bfcbae9a83529a02d2af44ac98535cf67a1a3244d7e9563a1cc73a0f6b8b6