A package that downloads a model from the Hugging Face Model Hub.

Project description

HFLoader - Hugging Face Model Loader

This package provides a single method that returns both a tokenizer and a model when loading Hugging Face models, without the need to know which AutoModel class to use.

It uses the transformers library under the hood, so a single call loads the tokenizer and model for a given model on the Hugging Face Model Hub. The result can then be fed directly into the pipeline function of transformers.
Installation
The package can be installed with the following command:
pip install hfloader
How to Use
Here is a short example showing how to use the package:

import torch
from transformers import pipeline

import hfloader as hfl

huggingface_model = "cardiffnlp/twitter-roberta-base-sentiment"
tokenizer, model = hfl.load_model(huggingface_model)
pipe = pipeline("sentiment-analysis", model=model, tokenizer=tokenizer, torch_dtype=torch.bfloat16, device_map="auto")
Calling load_model() returns a tokenizer and a model, which can then be used with the pipeline function provided by transformers. It does so without requiring you to know which AutoModel class applies, which can be a hassle when trying out different models.
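The package's internals aren't shown here, but loaders like this commonly resolve the right model class by trying candidate classes in order and keeping the first one that loads. A minimal sketch of that pattern, using hypothetical stand-in classes rather than the real transformers Auto classes:

```python
# Sketch of the "try each candidate class" pattern a loader like
# load_model() might use. CandidateA/CandidateB are hypothetical
# stand-ins for classes such as AutoModelForSequenceClassification
# or AutoModelForCausalLM.

class CandidateA:
    @classmethod
    def from_pretrained(cls, name):
        # Simulates a class that rejects this checkpoint.
        raise ValueError(f"{name} is not compatible with CandidateA")

class CandidateB:
    @classmethod
    def from_pretrained(cls, name):
        # Simulates a successful load.
        return f"CandidateB model for {name}"

def load_with_fallback(name, candidates):
    """Return the first candidate model that loads successfully."""
    errors = []
    for cls in candidates:
        try:
            return cls.from_pretrained(name)
        except ValueError as exc:
            errors.append(exc)
    raise RuntimeError(f"No candidate class could load {name}: {errors}")

model = load_with_fallback("demo-model", [CandidateA, CandidateB])
print(model)  # CandidateB model for demo-model
```

This mirrors the convenience the package advertises: the caller never names a specific model class, and incompatible classes are skipped automatically.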
Requirements
pip install transformers
Notes
I don't plan to maintain this project unless the need arises. It already achieves the goal I set out with when developing it.