
An extension for using Hugging Face tasks to parse models for AIConfig.

Project description

This extension contains AIConfig model parsers organized into two main subfolders:

  1. local_inference: Loads models onto your machine and runs inference locally using the Hugging Face transformers and diffusers libraries.
  2. remote_inference_client: Uses Hugging Face's InferenceClient API to connect to models hosted remotely.

Usage

Part 1: Update and test this extension

If you are not developing and testing locally (i.e. you are just using the published extension), skip this part and go to Part 2.

  1. From the aiconfig/extensions/HuggingFace directory, run this command: pip3 install build && cd python && python -m build && pip3 install dist/*.whl
  2. Link your local dev environment to the current directory: pip3 install -e . (note the trailing dot). Afterwards, if you run pip3 list | grep aiconfig, you should see the package linked to your local path. If you ever wish to go back to the published extension, first remove the local install: pip3 uninstall aiconfig-extension-hugging-face && pip3 install aiconfig-extension-hugging-face
  3. After you're done testing, be sure to delete the generated folders in the aiconfig/extensions/HuggingFace directory. They will look something like python/dist and python/<package_name>.egg-info

Part 2: Importing and using this extension

Install the published extension: pip3 install aiconfig-extension-hugging-face

  1. Import the model parser(s) you need into your code: from aiconfig_extension_hugging_face import <EXTENSION>.
  2. Import the AIConfig model parser registry: from aiconfig import ModelParserRegistry
  3. In code, register every model parser from this extension that you want to use. Ex: ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer()). See the docstrings on the ModelParserRegistry class for more info.
  4. In your AIConfig, add a model_parsers field that maps the model you want to use to the id of this extension's parser. Ex: https://github.com/lastmile-ai/aiconfig/blob/f1840995b7a12acba371a59ac3b8c69b3962fc68/cookbooks/Getting-Started/travel.aiconfig.json#L19-L22
  5. Now whenever you call aiconfig.run(), these model parsers will be loaded and available! (See the sketch after this list.)
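
Below is a minimal sketch of the steps above, using the HuggingFaceTextGenerationTransformer parser mentioned in step 3 and the standard aiconfig AIConfigRuntime API. The file name travel.aiconfig.json and the prompt name get_activities are hypothetical placeholders; substitute your own config, which must already contain the matching model_parsers entry.

```python
# Minimal sketch: register a parser from this extension, then run a prompt.
# The file name and prompt name below are placeholders; your aiconfig must
# already map its model to this extension's parser in its model_parsers field.
import asyncio

from aiconfig import AIConfigRuntime, ModelParserRegistry
from aiconfig_extension_hugging_face import HuggingFaceTextGenerationTransformer

# Step 3: register the model parser before loading/running the config.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())

async def main():
    # Steps 4-5: load an aiconfig whose model_parsers field points at this
    # parser, then run a prompt; the registered parser is resolved automatically.
    config = AIConfigRuntime.load("travel.aiconfig.json")  # placeholder file name
    result = await config.run("get_activities")            # placeholder prompt name
    print(result)

asyncio.run(main())
```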



Download files

Source Distribution

aiconfig_extension_hugging_face-0.0.8.tar.gz (23.3 kB)

  SHA256: ab50a3958c57c47b83f1d7e3f4501ff20183bb2d7573657d93ae879cdc50e7ca
  MD5: 52a64f6018f1a9488bdd22b278ec621f
  BLAKE2b-256: 57d2082b6fc02d30249bd222df95f38ede48e24be487d55ee3b968e437eb2c70

Built Distribution

aiconfig_extension_hugging_face-0.0.8-py3-none-any.whl

  SHA256: bb8812271c16a12672c87b0aaecec44c32db6905ce8f162a2805b2a25fb82cdd
  MD5: e7c3db1b855b6850aab725d50a45e560
  BLAKE2b-256: 38a56ef0c53bc8158e287ff1347d272905450747cc5b20d1db464630cd2caa58
