Utilities for finetuning BERT-like models
Project description
bert_experimental
Contains code and supplementary materials for a series of Medium articles about the BERT model.
- pretraining: https://towardsdatascience.com/pre-training-bert-from-scratch-with-cloud-tpu-6e2f71028379
- feature_extraction: https://medium.com/@gaphex/building-a-search-engine-with-bert-and-tensorflow-c6fdc0186c8a
- finetuning: https://medium.com/@gaphex/fine-tuning-bert-with-keras-and-tf-module-ed24ea91cff2
- representation: https://towardsdatascience.com/improving-sentence-embeddings-with-bert-and-representation-learning-dfba6b444f6b
Download files
Built Distribution
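Since only a built distribution (wheel) is published, the usual way to get it is straight from PyPI with pip. A minimal sketch, assuming pip and Python 3 are available on the target machine:

```shell
# Install the wheel from PyPI; the ==1.0.5 pin matches the build described below.
pip install bert_experimental==1.0.5
```

Pinning the exact version keeps the installed file consistent with the hashes listed under "File hashes".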
File details
Details for the file bert_experimental-1.0.5-py3-none-any.whl.
File metadata
- Download URL: bert_experimental-1.0.5-py3-none-any.whl
- Upload date:
- Size: 29.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.6.1 requests/2.22.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.6.8
File hashes
Algorithm | Hash digest
---|---
SHA256 | 7de340efc3639eace19b64a98a6db5f827a1dcd2616434630918e6c23e298fd1
MD5 | 429145eed013132016c8fab9a5e16c11
BLAKE2b-256 | 2e47659b902eb72104cb2418fff4cb340d646916bfe9e8b8e003ae96c21934d1
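To check a downloaded wheel against the published SHA256 digest above, a minimal sketch using only Python's standard `hashlib` (the local filename is an assumption; adjust the path to wherever the wheel was saved):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 8192) -> str:
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Published digest for bert_experimental-1.0.5-py3-none-any.whl (from the table above):
expected = "7de340efc3639eace19b64a98a6db5f827a1dcd2616434630918e6c23e298fd1"
# Uncomment once the wheel has been downloaded locally:
# assert sha256_of("bert_experimental-1.0.5-py3-none-any.whl") == expected
```

Chunked reading keeps memory use constant regardless of file size, which matters more for large distributions than for this 29.1 kB wheel.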