An extension for using Llama Guard with AIConfig

Project description

Using Llama Guard for model parsing with AIConfig

We are prioritizing Python library usage.

Part 1: Update and test this extension

If you are not testing locally (i.e., you are just using the published extension), skip ahead to Part 2.

  1. From the aiconfig/HuggingFaceTransformers directory, run this command: pip install build && cd python && python -m build && pip install dist/*.whl
  2. After you're done testing, be sure to delete the generated folder(s) in the aiconfig/HuggingFaceTransformers dir. These will probably look something like python/dist and python/<package_name>.egg-info
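Step 2's cleanup can be scripted. A minimal sketch, assuming the dist/ and *.egg-info layout described above (demonstrated against a throwaway temporary directory rather than a real checkout):

```python
import shutil
import tempfile
from pathlib import Path

def clean_build_artifacts(package_dir: Path) -> list[str]:
    """Remove build outputs like dist/ and *.egg-info under package_dir/python."""
    removed = []
    python_dir = package_dir / "python"
    for artifact in [python_dir / "dist", *python_dir.glob("*.egg-info")]:
        if artifact.exists():
            shutil.rmtree(artifact)
            removed.append(artifact.name)
    return removed

# Demonstrate against a throwaway directory layout.
root = Path(tempfile.mkdtemp())
(root / "python" / "dist").mkdir(parents=True)
(root / "python" / "my_package.egg-info").mkdir()
print(sorted(clean_build_artifacts(root)))  # ['dist', 'my_package.egg-info']
```

Running it a second time is a no-op, since the artifacts are already gone.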

Part 2: Importing and using this extension

  1. Import whatever package name pip reports from the last command. For now it's text-generation, but this may change in the future.
  2. In code, register each model parser you want to use from this extension: ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer()). See the docstrings on the ModelParserRegistry class for more info.
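The registration call above follows a simple registry pattern. The real ModelParserRegistry and parser classes live in the aiconfig packages; the classes below are simplified stand-ins sketching how registration and lookup fit together:

```python
class ModelParser:
    """Simplified stand-in for aiconfig's model parser base class."""
    def id(self) -> str:
        raise NotImplementedError

class HuggingFaceTextGenerationTransformer(ModelParser):
    """Stand-in parser; the real one wraps Hugging Face text generation."""
    def id(self) -> str:
        return "HuggingFaceTextGenerationTransformer"

class ModelParserRegistry:
    """Maps parser ids to parser instances, mirroring the aiconfig API shape."""
    _parsers: dict[str, ModelParser] = {}

    @classmethod
    def register_model_parser(cls, parser: ModelParser) -> None:
        cls._parsers[parser.id()] = parser

    @classmethod
    def get_model_parser(cls, parser_id: str) -> ModelParser:
        return cls._parsers[parser_id]

# Register once at startup, then look parsers up by id anywhere in the app.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())
parser = ModelParserRegistry.get_model_parser("HuggingFaceTextGenerationTransformer")
print(parser.id())  # HuggingFaceTextGenerationTransformer
```

Registering at import/startup time means the rest of the application only deals in parser ids, not concrete classes.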

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aiconfig_extension_gemini-0.0.3.tar.gz (5.7 kB)

Uploaded Source

Built Distribution

aiconfig_extension_gemini-0.0.3-py3-none-any.whl (5.9 kB)

Uploaded Python 3
