
tools for using huggingface/transformers more easily

Project description

Transformersx

🤗 Transformers is a great project that brings the Transformer architecture to NLP and NLG.

🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, T5, CTRL...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with thousands of pretrained models in 100+ languages and deep interoperability between PyTorch & TensorFlow 2.0.

The purpose of this project is to add tools that make the huggingface/transformers library easier to use for NLP. NLP task examples have been refactored or added as well.

By the way, because it is hard to download the pretrained models from huggingface, especially in China, this project uses a trick to work around that problem.

For personal use, Aliyun builds Docker images for free, and the build can run on an overseas machine. When the image is built on an overseas machine, it can download the pretrained models from huggingface quickly.

After the image is built, it can be pulled from Aliyun quickly, and the pretrained models can then be taken out of the Docker image.
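As a rough illustration of that build step, a script run during docker build could pre-download the models into a cache directory that stays inside the image. This is only a minimal sketch under assumptions of my own: the script name, the model list, and the cache path are placeholders, not anything defined by this package.

    # download_models.py - executed during "docker build" on the overseas machine,
    # so the pretrained weights end up baked into the image layers.
    from transformers import AutoModel, AutoTokenizer

    MODELS = ["bert-base-chinese"]         # placeholder: models to bundle into the image
    CACHE_DIR = "/models/transformers"     # placeholder: cache directory kept in the image

    for name in MODELS:
        AutoTokenizer.from_pretrained(name, cache_dir=CACHE_DIR)
        AutoModel.from_pretrained(name, cache_dir=CACHE_DIR)

Once the image has been pulled from Aliyun, the cached files can be copied out of a container with docker cp or exposed through a volume mount.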

TODO:

  • (1) Build a management tool for transformers models that categorizes them; its features should also include training, fine-tuning, and prediction
  • (2) Use Streamlit to build the UI for the management tool (a rough sketch follows this list)
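
Purely as an illustration of that planned direction, a Streamlit page for the management tool might start out like the sketch below; every model name, category, and widget here is hypothetical, and nothing in it ships with this package.

    import streamlit as st

    st.title("Transformers model manager (sketch)")

    # hypothetical categories and models; a real tool would list the models it actually manages
    category = st.sidebar.selectbox("Model category", ["classification", "ner", "qa"])
    model = st.sidebar.selectbox("Model", ["bert-base-chinese", "roberta-base"])
    action = st.selectbox("Action", ["train", "finetune", "predict"])

    if st.button("Run"):
        # the real tool would dispatch to training / fine-tuning / prediction here
        st.write(f"Would run '{action}' on {model} ({category})")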

Project details


Release history

This version: 0.4.2

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ai-transformersx-0.4.2.tar.gz (49.4 kB)


Built Distribution

ai_transformersx-0.4.2-py2.py3-none-any.whl (100.8 kB)


File details

Details for the file ai-transformersx-0.4.2.tar.gz.

File metadata

  • Download URL: ai-transformersx-0.4.2.tar.gz
  • Upload date:
  • Size: 49.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.0

File hashes

Hashes for ai-transformersx-0.4.2.tar.gz

  • SHA256: a6be18172db69a76fc16219ad63b89926f886dec71f745c8a2c7d4bc857b2375
  • MD5: 7b4e8fe1560e98e8df800f6d36acefbd
  • BLAKE2b-256: 40c5f07e3eb8a4c17e62bffdcabee3b5cae6b05e39f08051891cfcbc995e928d


File details

Details for the file ai_transformersx-0.4.2-py2.py3-none-any.whl.

File metadata

  • Download URL: ai_transformersx-0.4.2-py2.py3-none-any.whl
  • Upload date:
  • Size: 100.8 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.0

File hashes

Hashes for ai_transformersx-0.4.2-py2.py3-none-any.whl

  • SHA256: c1979a9c919c882d02909921d324526cf9d77fc1a9a8c647c79945addc41bac7
  • MD5: 9a2fc0bb4655fdc26394eb743e791a84
  • BLAKE2b-256: 88c7e688e4a2709ee25cd2cb0d503f95a8b0361004bb83503f3540649b18e21e

