
An extension for using Hugging Face tasks to parse models for AIConfig.

Project description

This extension contains AIConfig model parsers organized into two main subfolders (illustrative imports are sketched after this list):

  1. local_inference: loads models onto your machine and runs them locally with Hugging Face transformers and diffusers.
  2. remote_inference_client: uses Hugging Face's InferenceClient API to connect to models hosted remotely.
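As a rough illustration of the two flavors (the module paths and class names below are assumptions based on the subfolder layout, so check the package source for the parsers you actually need), imports would look something like this:

```python
# Local inference: runs the model on your own machine via transformers/diffusers.
# (Module path and class name are assumptions for illustration.)
from aiconfig_extension_hugging_face.local_inference.text_generation import (
    HuggingFaceTextGenerationTransformer,
)

# Remote inference: calls models hosted on Hugging Face via the InferenceClient API.
# (Module path and class name are assumptions for illustration.)
from aiconfig_extension_hugging_face.remote_inference_client.text_generation import (
    HuggingFaceTextGenerationParser,
)
```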

Usage

Part 1: Update and test this extension

If you are not developing or testing this extension locally (i.e., you are just using the published extension), skip this part and go to Part 2.

  1. From the aiconfig/extensions/HuggingFace directory, run this command: pip3 install build && cd python && python -m build && pip3 install dist/*.whl
  2. Link your local dev environment to the current directory: pip3 install -e . Afterwards, if you run pip3 list | grep aiconfig, you should see the package linked to your local path (a quick Python check is also sketched after this list). If you ever wish to go back to the published extension, you will need to first remove the local one: pip3 uninstall aiconfig-extension-hugging-face && pip3 install aiconfig-extension-hugging-face
  3. After you're done testing, be sure to delete the generated folder(s) in the aiconfig/extensions/HuggingFace directory. They'll probably look something like python/dist and python/<package_name>.egg-info
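As a quick sanity check for step 2 (a minimal sketch, not part of the extension itself), you can confirm from Python that the package resolves to your local checkout rather than a published install:

```python
# For an editable ("pip3 install -e") install, the printed path should point
# into your local aiconfig/extensions/HuggingFace checkout, not site-packages.
import aiconfig_extension_hugging_face

print(aiconfig_extension_hugging_face.__file__)
```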

Part 2: Importing and using this extension

pip3 install aiconfig-extension-hugging-face

  1. Import the library into your code: from aiconfig_extension_hugging_face import <EXTENSION>.
  2. Import the AIConfig model parser registry: from aiconfig import ModelParserRegistry.
  3. In code, register all the relevant model parser objects that you want to use from this extension. Ex: ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer()). You can read the docstrings under the ModelParserRegistry class for more info.
  4. In your AIConfig, add a model_parsers field that maps the model you want to use to the id of the extension parser. Ex: https://github.com/lastmile-ai/aiconfig/blob/f1840995b7a12acba371a59ac3b8c69b3962fc68/cookbooks/Getting-Started/travel.aiconfig.json#L19-L22
  5. Now whenever you call aiconfig.run(), these model parsers will be loaded and available! A minimal end-to-end sketch follows this list.
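Putting the steps together, a minimal end-to-end sketch might look like the following. The parser class and its module path, the travel.aiconfig.json file, and the prompt name "get_activities" are illustrative assumptions; substitute the parsers and prompts from your own AIConfig:

```python
import asyncio

from aiconfig import AIConfigRuntime, ModelParserRegistry
# Assumed module path/class for illustration; import the parser(s) you actually need.
from aiconfig_extension_hugging_face.local_inference.text_generation import (
    HuggingFaceTextGenerationTransformer,
)

# Step 3: register the extension's parser with the AIConfig model parser registry.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())

async def main():
    # Step 4: the aiconfig file is assumed to declare a model_parsers entry that
    # maps the model name to this parser (see the linked travel.aiconfig.json example).
    config = AIConfigRuntime.load("travel.aiconfig.json")

    # Step 5: run a prompt; the registered Hugging Face parser handles inference.
    result = await config.run("get_activities")
    print(result)

asyncio.run(main())
```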



Download files

Download the file for your platform.

Source Distribution

aiconfig_extension_hugging_face-0.0.13.tar.gz (24.4 kB)

Built Distribution

aiconfig_extension_hugging_face-0.0.13-py3-none-any.whl

Hashes for aiconfig_extension_hugging_face-0.0.13.tar.gz

SHA256: 63bce1436683b28276885a52a51989c6219872287f1bdc16bcfc227c69e6bf9f
MD5: 571421251baff27be78cde549f7c0083
BLAKE2b-256: e5f72e05fb8ddf44fbbc083da74888596d14684ab5abc3607a331892b1be6966


Hashes for aiconfig_extension_hugging_face-0.0.13-py3-none-any.whl

SHA256: c5acb889f2b0b65a95a8464d548e324e4b6c3216ee535d9447be79d2e7c7e414
MD5: b7b85c39478294c7566a45e26a8d5cf4
BLAKE2b-256: 901acfdd951dc4f691d6f320eed41f3975fd5eabe52d9f460da3559ca9219326

