Utilities for finetuning BERT-like models
Project description
bert_experimental
This package contains code and supplementary materials for a series of Medium articles on the BERT model:
- pretraining: https://towardsdatascience.com/pre-training-bert-from-scratch-with-cloud-tpu-6e2f71028379
- feature_extraction: https://medium.com/@gaphex/building-a-search-engine-with-bert-and-tensorflow-c6fdc0186c8a
- finetuning: https://medium.com/@gaphex/fine-tuning-bert-with-keras-and-tf-module-ed24ea91cff2
- representation: https://towardsdatascience.com/improving-sentence-embeddings-with-bert-and-representation-learning-dfba6b444f6b
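
The package is published on PyPI, so the simplest way to try the utilities alongside the articles is a plain pip install (a minimal sketch; the supported Python and TensorFlow versions are not stated on this page):

```sh
# install the wheel published on PyPI
pip install bert_experimental
```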
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distributions
No source distribution files are available for this release. See the tutorial on generating distribution archives.
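
For maintainers who want to publish a source distribution alongside the wheel, the standard PyPA build frontend can generate both archives (a sketch of the generic workflow, not a command taken from this project):

```sh
# install the PyPA build frontend, then build the sdist and wheel into dist/
python -m pip install build
python -m build
```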
Built Distribution
Hashes for bert_experimental-1.0.5-py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 7de340efc3639eace19b64a98a6db5f827a1dcd2616434630918e6c23e298fd1 |
| MD5 | 429145eed013132016c8fab9a5e16c11 |
| BLAKE2b-256 | 2e47659b902eb72104cb2418fff4cb340d646916bfe9e8b8e003ae96c21934d1 |
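
The published SHA256 can also be used to verify what pip actually installs via its hash-checking mode (a sketch; the requirements file name is illustrative):

```sh
# pin the release to its published SHA256, then install with hash checking
echo "bert_experimental==1.0.5 --hash=sha256:7de340efc3639eace19b64a98a6db5f827a1dcd2616434630918e6c23e298fd1" > requirements.txt
pip install --require-hashes -r requirements.txt
```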