
An extension that provides AIConfig model parsers for Hugging Face tasks and models.


This extension contains AIConfig model parsers organized into two main subfolders (see the import sketch after the list):

  1. local_inference: Loads models onto your machine and runs inference locally using the Hugging Face transformers and diffusers libraries.
  2. remote_inference_client: Uses Hugging Face's InferenceClient API to connect to models hosted remotely.
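
For illustration, here is a minimal sketch of how the two kinds of parsers might be imported. The module paths and the remote parser's class name are assumptions based on the subfolder names above; check the installed package for the exact names in your version.

```python
# Local inference: runs Hugging Face transformers/diffusers models on your own machine.
# NOTE: module path assumed to mirror the local_inference subfolder.
from aiconfig_extension_hugging_face.local_inference.text_generation import (
    HuggingFaceTextGenerationTransformer,
)

# Remote inference: connects to hosted models via Hugging Face's InferenceClient API.
# NOTE: module path and class name are assumptions for illustration.
from aiconfig_extension_hugging_face.remote_inference_client.text_generation import (
    HuggingFaceTextGenerationParser,
)
```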

Usage

Part 1: Update and test this extension

If you are not testing and developing this extension locally (i.e. you are just using the published extension), skip this part and go to Part 2.

  1. From the aiconfig/extensions/HuggingFace directory, run this command: pip3 install build && cd python && python -m build && pip3 install dist/*.whl
  2. Link your local dev environment to the current directory: pip3 install -e . Afterwards, if you run pip3 list | grep aiconfig, you should see the package linked to your local path. If you ever wish to use the published extension again, first remove the local install: pip3 uninstall aiconfig-extension-hugging-face && pip3 install aiconfig-extension-hugging-face. A quick way to verify the link is sketched after this list.
  3. After you're done testing, be sure to delete the generated folder(s) in the aiconfig/extensions/HuggingFace directory. They will probably look something like python/dist and python/<package_name>.egg-info.
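
As a quick check that the editable install from step 2 is active, you can print where Python resolves the package from; this is just a sanity-check snippet, not part of the extension:

```python
# Confirm the extension resolves to your local checkout rather than site-packages.
import aiconfig_extension_hugging_face

# For an editable install, this path should point inside your local
# aiconfig/extensions/HuggingFace checkout.
print(aiconfig_extension_hugging_face.__file__)
```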

Part 2: Importing and using this extension

Install the published extension from PyPI: pip3 install aiconfig-extension-hugging-face

  1. Import the model parser(s) you want to use into your code: from aiconfig_extension_hugging_face import <EXTENSION>.
  2. Import the AIConfig model parser registry: from aiconfig import ModelParserRegistry
  3. In code, register each model parser object from this extension that you want to use. Ex: ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer()). You can read the docstrings on the ModelParserRegistry class for more info.
  4. In your AIConfig, add a model_parsers field that maps the model you want to use to the ID of the model parser from this extension. Ex: https://github.com/lastmile-ai/aiconfig/blob/f1840995b7a12acba371a59ac3b8c69b3962fc68/cookbooks/Getting-Started/travel.aiconfig.json#L19-L22
  5. Now whenever you call aiconfig.run(), these model parsers will be loaded and available! The sketch below puts these steps together.
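
Here is a minimal end-to-end sketch of the steps above. The top-level import path for HuggingFaceTextGenerationTransformer, the config filename, and the prompt name are assumptions for illustration; substitute your own:

```python
import asyncio

from aiconfig import AIConfigRuntime, ModelParserRegistry

# Assumed top-level export; the class may live in a submodule depending on the extension version.
from aiconfig_extension_hugging_face import HuggingFaceTextGenerationTransformer

# Step 3: register the parser so AIConfig knows how to run Hugging Face text-generation models.
ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())

async def main():
    # Step 4: this config's model_parsers field maps the model name to this parser's ID.
    config = AIConfigRuntime.load("travel.aiconfig.json")  # hypothetical config file

    # Step 5: run a prompt; the registered parser handles the Hugging Face model call.
    result = await config.run("get_activities")  # hypothetical prompt name
    print(result)

asyncio.run(main())
```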

