
An AIConfig extension that provides model parsers for Hugging Face tasks.

Project description

This extension contains AIConfig model parsers organized into two main subfolders (see the import sketch after this list):

  1. local_inference: Loads models onto your machine and runs them locally with Hugging Face transformers and diffusers.
  2. remote_inference_client: Uses Hugging Face's InferenceClient API to connect to models remotely.
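
For example, a minimal import sketch (the submodule paths and the remote parser's class name here are assumptions for illustration; check the package source for the parser you actually need):

    # Local inference: runs models on your machine via transformers/diffusers.
    from aiconfig_extension_hugging_face.local_inference.text_generation import (
        HuggingFaceTextGenerationTransformer,
    )

    # Remote inference: connects to hosted models via Hugging Face's InferenceClient.
    from aiconfig_extension_hugging_face.remote_inference_client.text_generation import (
        HuggingFaceTextGenerationParser,
    )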

Usage

Part 1: Update and test this extension

If you are not developing or testing locally (i.e. you are just using the published extension), skip this part and go to Part 2.

  1. From the aiconfig/extensions/HuggingFace directory, run this command: pip3 install build && cd python && python -m build && pip3 install dist/*.whl
  2. Link your local dev environment to the current directory: pip3 install -e . Afterwards, if you run pip3 list | grep aiconfig, you should see the package linked to your local path (a quick check is sketched after this list). If you ever wish to go back to the published extension, first remove the local install: pip3 uninstall aiconfig-extension-hugging-face && pip3 install aiconfig-extension-hugging-face
  3. After you're done testing, be sure to delete the generated folder(s) in the aiconfig/extensions/HuggingFace directory. They will probably look something like python/dist and python/<package_name>.egg-info
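
A quick way to confirm the editable install from step 2 (a small sketch; for a linked install the printed path should point into your local checkout rather than site-packages):

    import aiconfig_extension_hugging_face

    # For an editable (pip install -e) install, this resolves to your local
    # aiconfig/extensions/HuggingFace checkout instead of site-packages.
    print(aiconfig_extension_hugging_face.__file__)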

Part 2: Importing and using this extension

pip3 install aiconfig-extension-hugging-face

  1. Import the extension in your code: from aiconfig_extension_hugging_face import <EXTENSION>.
  2. Import the AIConfig model parser registry: from aiconfig import ModelParserRegistry
  3. In code, add all the relevant model parser objects that you want to use from this extension to the registry. Ex: ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer()). You can read the docstrings under the ModelParserRegistry class for more info.
  4. In your AIConfig, add a model_parsers field that maps the model you want to use to the id of the model parser from this extension. Ex: https://github.com/lastmile-ai/aiconfig/blob/f1840995b7a12acba371a59ac3b8c69b3962fc68/cookbooks/Getting-Started/travel.aiconfig.json#L19-L22
  5. Now whenever you call aiconfig.run(), these model parsers will be loaded and available! (An end-to-end sketch follows this list.)
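
Putting the steps together, a minimal end-to-end sketch (the top-level import, config path, and prompt name are illustrative assumptions; the config and prompt come from the Getting-Started cookbook linked in step 4):

    import asyncio

    from aiconfig import AIConfigRuntime, ModelParserRegistry
    # Assumption: the text-generation parser named in step 3 is importable from the
    # package root; adjust the import path for the subfolder (local or remote) you use.
    from aiconfig_extension_hugging_face import HuggingFaceTextGenerationTransformer

    # Step 3: register the parser so the model_parsers entries in the aiconfig
    # (step 4) can resolve to it.
    ModelParserRegistry.register_model_parser(HuggingFaceTextGenerationTransformer())

    async def main():
        # Step 5: load the aiconfig and run a prompt with the registered parser.
        config = AIConfigRuntime.load("travel.aiconfig.json")
        output = await config.run("get_activities")
        print(output)

    asyncio.run(main())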
