An extension that provides Hugging Face task-based model parsers for AIConfig.

Project description

This extension contains AIConfig model parsers, organized into two main subfolders:

  1. local_inference: Loads models onto your machine and runs them locally with Hugging Face transformers and diffusers.
  2. remote_inference_client: Uses Hugging Face's InferenceClient API to connect to models hosted remotely.

Usage

Part 1: Update and test this extension

If you are not developing or testing this extension locally (i.e., you are just using the published extension), skip this part and go to Part 2.

  1. From the aiconfig/extensions/HuggingFace directory, run this command: pip3 install build && cd python && python -m build && pip3 install dist/*.whl
  2. Link your local dev environment to the current directory: pip3 install -e . Afterwards, if you run pip3 list | grep aiconfig, you should see the package linked to your local path (the sketch after this list shows how to do the same check from Python). If you ever wish to go back to the published extension, you will need to remove the local link first: pip3 uninstall aiconfig-extension-hugging-face && pip3 install aiconfig-extension-hugging-face
  3. After you're done testing, be sure to delete the generated folders in the aiconfig/extensions/HuggingFace directory. They will typically be python/dist and python/<package_name>.egg-info
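
If you prefer to verify the editable link from Python rather than grepping pip3 list, here is a minimal sketch using only the standard library. It is not part of this extension; it just inspects the PEP 610 direct_url.json record that pip writes for editable installs.

```python
# Minimal sketch: list installed aiconfig packages and flag editable installs.
# pip records "pip install -e" installs in direct_url.json (PEP 610).
import json
from importlib.metadata import distributions

for dist in distributions():
    name = (dist.metadata["Name"] or "").lower()
    if "aiconfig" not in name:
        continue
    raw = dist.read_text("direct_url.json")  # absent for normal index installs
    editable, source = False, "PyPI / package index"
    if raw:
        info = json.loads(raw)
        editable = info.get("dir_info", {}).get("editable", False)
        source = info.get("url", source)
    print(f"{name} {dist.version} editable={editable} source={source}")
```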

Part 2: Importing and using this extension

Install the published extension: pip3 install aiconfig-extension-hugging-face

  1. Import the parser(s) you need from this library: from aiconfig_extension_hugging_face import <EXTENSION>.
  2. Import the AIConfig model parser registry: from aiconfig import ModelParserRegistry.
  3. In code, register every model parser object from this extension that you want to use. Ex: ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer()). You can read the docstrings on the ModelParserRegistry class for more info.
  4. In your AIConfig, add a model_parsers field that maps the model you want to use to the ID of the parser you want to use from this extension. Ex: https://github.com/lastmile-ai/aiconfig/blob/f1840995b7a12acba371a59ac3b8c69b3962fc68/cookbooks/Getting-Started/travel.aiconfig.json#L19-L22
  5. Now, whenever you call aiconfig.run(), these model parsers will be loaded and available. The sketch below ties these steps together.
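
Putting steps 1 through 5 together, here is a minimal sketch. HuggingFaceTextGenerationTransformer is the parser named in step 3 and is assumed to be importable from the package top level (as in step 1); the config path and prompt name are placeholders for your own AIConfig, such as the linked travel.aiconfig.json.

```python
import asyncio

from aiconfig import AIConfigRuntime, ModelParserRegistry
from aiconfig_extension_hugging_face import HuggingFaceTextGenerationTransformer

# Step 3: register the extension's parser before loading any config that references it.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())

# Step 4: the AIConfig's model_parsers field maps your model name to this parser.
config = AIConfigRuntime.load("travel.aiconfig.json")  # placeholder path

# Step 5: run a prompt; AIConfigRuntime.run() is a coroutine, so drive it with asyncio.
result = asyncio.run(config.run("get_activities"))  # placeholder prompt name
print(result)
```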

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aiconfig_extension_hugging_face-0.0.11.tar.gz (24.2 kB)

Built Distribution

aiconfig_extension_hugging_face-0.0.11-py3-none-any.whl

File details

Details for the file aiconfig_extension_hugging_face-0.0.11.tar.gz.

File hashes

Hashes for aiconfig_extension_hugging_face-0.0.11.tar.gz
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | e33cc17fcb738efc88d301d98a93707657b55362a8cb2f872f5e3436e4df06af |
| MD5 | 6d226679b53319e13fa30cd51cb0e04d |
| BLAKE2b-256 | 411779fc36d91a036643f4281b7598870fcbe600717a3dcfcc77d3d8e0b960ef |


File details

Details for the file aiconfig_extension_hugging_face-0.0.11-py3-none-any.whl.

File hashes

Hashes for aiconfig_extension_hugging_face-0.0.11-py3-none-any.whl
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 935f4182097163af590a90269296d979ccccb84c13f262737b665b822a6fcf5a |
| MD5 | 965f91d221f8998f426e7b9e9c85fbff |
| BLAKE2b-256 | 3067ac2b7cbe76d3540c1cea95c8ad1f29d5265e955d940acc061e42f0c12ca6 |

