
tools for using huggingface/transformers more easily

Project description

Transformersx

🤗 Transformers is a great project implementing the Transformer architecture for NLP and NLG.

"🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, T5, CTRL...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with over thousands of pretrained models in 100+ languages and deep interoperability between PyTorch & TensorFlow 2.0."

The purpose of this project is to provide additional tools that make the huggingface/transformers library easier to use for NLP. Several NLP task examples were refactored or added as well.

BTW, because it is hard to download the pretrained models from huggingface, especially in China, this project uses a trick to work around the problem.

For personal use, Aliyun builds docker images for free, and the build can typically run on an overseas machine. An image built on an overseas machine can download the pretrained models from huggingface quickly.

After the image is built, it can be pulled from Aliyun quickly, and the pretrained models can then be extracted from the docker image.
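The trick above can be sketched as a Dockerfile that bakes the pretrained models into the image at build time. This is a minimal sketch under assumptions: the model name (`bert-base-chinese`), the cache directory (`/models`), and the base image are illustrative choices, not the project's actual build files.

```dockerfile
# Built on an overseas machine (e.g. via Aliyun's container image build
# service), where downloads from huggingface are fast.
FROM python:3.6-slim

RUN pip install torch transformers

# Download the pretrained model and tokenizer into the image at build time.
# Model name and cache directory are illustrative.
RUN python -c "from transformers import AutoModel, AutoTokenizer; \
    AutoModel.from_pretrained('bert-base-chinese', cache_dir='/models'); \
    AutoTokenizer.from_pretrained('bert-base-chinese', cache_dir='/models')"
```

After pulling the image from Aliyun, the cached models can be copied out of the image with standard docker commands, e.g. `docker create --name tmp <image>` followed by `docker cp tmp:/models ./models`.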

TODO:

  • (1) Build a management tool for transformers models that categorizes the models; its features should also include training, fine-tuning, and prediction
  • (2) Use Streamlit to build the UI for the management tool

Project details


Release history

Download files

Download the file for your platform.

Source Distribution

ai-transformersx-0.4.25.tar.gz (35.3 kB)

Uploaded Source

Built Distribution


ai_transformersx-0.4.25-py2.py3-none-any.whl (108.4 kB)

Uploaded Python 2, Python 3

File details

Details for the file ai-transformersx-0.4.25.tar.gz.

File metadata

  • Download URL: ai-transformersx-0.4.25.tar.gz
  • Upload date:
  • Size: 35.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.0

File hashes

Hashes for ai-transformersx-0.4.25.tar.gz
Algorithm Hash digest
SHA256 2a37aaf1394f1dfd5c163b19c7798afd5383dcb3a44951185f5e3531fe4eefae
MD5 6f724e229f1b5e1fb3dc805152e79e9a
BLAKE2b-256 b992a3523f0436d25befb68f10fd3a6129337652d757b86f40449b952395f454


File details

Details for the file ai_transformersx-0.4.25-py2.py3-none-any.whl.

File metadata

  • Download URL: ai_transformersx-0.4.25-py2.py3-none-any.whl
  • Upload date:
  • Size: 108.4 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.0

File hashes

Hashes for ai_transformersx-0.4.25-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 636c67e2f460c28c58600bb15ba124f99c4d94a2fff6b724f9c9c910459c0edb
MD5 af87c76509097744fda0e03c892d6f9b
BLAKE2b-256 a00975caada90d7a361ecaf8094f94d4c991c1a9090a70e4900d29cf29757a20

