
tools for using huggingface/transformers more easily

Project description

Transformersx

🤗 Transformers is a great project built around the Transformer architecture for NLP and NLG.

🤗 Transformers (formerly known as pytorch-transformers and pytorch-pretrained-bert) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, T5, CTRL...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with thousands of pretrained models in 100+ languages and deep interoperability between PyTorch & TensorFlow 2.0.

The purpose of this project is to add tools that make the huggingface/transformers library easier to use for NLP. NLP task examples have been refactored or added as well.

By the way: because it is hard to download the pretrained models from huggingface, especially in China, this project uses a trick to work around that problem.

For personal accounts, Aliyun builds Docker images for free, and the build can be run on an overseas machine. An image built on an overseas machine can download the pretrained models from huggingface quickly.

After the image is built, it can be pulled from Aliyun quickly, and the pretrained models can then be extracted from the Docker image.
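The build trick described above can be sketched roughly as follows. This is not the project's actual Dockerfile; the base image, model name, and download step are illustrative assumptions:

```dockerfile
# Illustrative sketch only: built on Aliyun's overseas builders, so the
# huggingface download happens at build time, not on the user's machine.
FROM python:3.6-slim

RUN pip install transformers

# Hypothetical step: fetch a pretrained model into the image's cache at build time.
RUN python -c "from transformers import AutoModel, AutoTokenizer; \
    AutoModel.from_pretrained('bert-base-chinese'); \
    AutoTokenizer.from_pretrained('bert-base-chinese')"
```

After pulling the image from Aliyun, the cached model files can be copied out of a container with `docker cp`; the exact cache path inside the image depends on the transformers version in use.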

TODO:

  • (1) Build a management tool for transformers models that categorizes the models; its features should include training, fine-tuning, and prediction
  • (2) Use Streamlit to build the UI for the management tool
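The planned management tool might categorize models along lines like the following. This is purely an illustrative sketch of the idea; none of these names exist in the project:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelEntry:
    """One pretrained model in the catalog (illustrative only)."""
    name: str       # huggingface model id, e.g. "bert-base-chinese"
    task: str       # e.g. "classification", "ner"
    language: str   # e.g. "zh", "en"

class ModelCatalog:
    """A tiny in-memory catalog that categorizes models by task and language."""

    def __init__(self):
        self._entries = []

    def register(self, entry: ModelEntry) -> None:
        self._entries.append(entry)

    def find(self, task=None, language=None):
        """Return all entries matching the given task and/or language."""
        return [e for e in self._entries
                if (task is None or e.task == task)
                and (language is None or e.language == language)]

catalog = ModelCatalog()
catalog.register(ModelEntry("bert-base-chinese", "classification", "zh"))
catalog.register(ModelEntry("bert-base-uncased", "classification", "en"))
print([e.name for e in catalog.find(language="zh")])  # ['bert-base-chinese']
```

A real tool would also have to attach training, fine-tuning, and prediction actions to each entry, which is where the Streamlit UI from the TODO list would come in.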

Project details


Release history

This version

0.4.7

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ai-transformersx-0.4.7.tar.gz (49.4 kB)

Uploaded Source

Built Distribution

ai_transformersx-0.4.7-py2.py3-none-any.whl (100.8 kB)

Uploaded Python 2 Python 3

File details

Details for the file ai-transformersx-0.4.7.tar.gz.

File metadata

  • Download URL: ai-transformersx-0.4.7.tar.gz
  • Upload date:
  • Size: 49.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.0

File hashes

Hashes for ai-transformersx-0.4.7.tar.gz
Algorithm Hash digest
SHA256 2a669f6703669d4ac4859489aff7315a3ac469dd022ae939db6fec6887801510
MD5 543d454529f14493569d487ab668a81b
BLAKE2b-256 5810b003908dc51268264b8741882b23b9c2703eb3af55317a64923648e30fbf

See more details on using hashes here.
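Published digests like those above can be checked against a downloaded file. A minimal sketch using only the Python standard library (the file path and expected digest below are placeholders to be replaced with the real values from this page):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Example usage (placeholder values):
# assert sha256_of("ai-transformersx-0.4.7.tar.gz") == "<digest from this page>"
```

Comparing the computed digest with the one published here confirms the download was not corrupted or tampered with in transit.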

File details

Details for the file ai_transformersx-0.4.7-py2.py3-none-any.whl.

File metadata

  • Download URL: ai_transformersx-0.4.7-py2.py3-none-any.whl
  • Upload date:
  • Size: 100.8 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0.post20200518 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.6.0

File hashes

Hashes for ai_transformersx-0.4.7-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 008227adf4ef585a03a2f51f000507fe9513f3a0e1024c5fca1727885c1f48e2
MD5 a524aa855bf8d289696775936951aabb
BLAKE2b-256 c55c65b418a2fdc0826bc95e3578c02e7cb3eb480e7f3b16aefd8d7b77fb20a0

See more details on using hashes here.
