A TensorFlow 2.0 Keras implementation of BERT.
Project description
This repo contains a TensorFlow v2 Keras implementation of google-research/bert, with support for loading the original pre-trained weights and producing activations numerically identical to those of the original model.
LICENSE
MIT. See the LICENSE file.
Install
bert-for-tf2 is on the Python Package Index (PyPI):
pip install bert-for-tf2
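Once installed, the package should be importable as bert (the top-level module name here is taken from the project's GitHub README, not from this page):

python -c "import bert"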
Usage
TBD
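Until usage docs land, here is a minimal illustrative sketch of wiring the BERT layer into a Keras model and loading a Google checkpoint. The API names (params_from_pretrained_ckpt, BertModelLayer.from_params, load_stock_weights) and the checkpoint paths follow the project's GitHub README and are assumptions for this early release, not official documentation:

# Minimal sketch, not official usage docs: API names below are
# assumptions taken from the project's GitHub README and may
# differ in this early release.
import os

import bert
from tensorflow import keras

model_dir = ".models/uncased_L-12_H-768_A-12"  # an unpacked google-research/bert checkpoint
max_seq_len = 128

# read bert_config.json from the checkpoint directory and create the layer
bert_params = bert.params_from_pretrained_ckpt(model_dir)
l_bert = bert.BertModelLayer.from_params(bert_params, name="bert")

# wire the BERT layer into a plain Keras model
l_input_ids = keras.layers.Input(shape=(max_seq_len,), dtype="int32")
output = l_bert(l_input_ids)  # shape: [batch, max_seq_len, hidden_size]
model = keras.Model(inputs=l_input_ids, outputs=output)
model.build(input_shape=(None, max_seq_len))

# load the original pre-trained weights into the layer
bert.load_stock_weights(l_bert, os.path.join(model_dir, "bert_model.ckpt"))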
Resources
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding - the original BERT paper
google-research/bert - the original BERT implementation
kpe/params-flow - utilities for reducing keras boilerplate code in custom layers
Download files
Download the file for your platform.
Source Distribution
bert-for-tf2-0.1.1.tar.gz (23.5 kB)
Built Distribution
bert_for_tf2-0.1.1-py3-none-any.whl
Hashes for bert_for_tf2-0.1.1-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 6d85674dbf1082cdc813f779b40ac61f3d7cf0aaf1bcd0e329753ef8acd3e504
MD5 | d5ddbe4f48454269d0806467b2ae4f45
BLAKE2b-256 | cdf6a6c9612201de1e5600c4903bb700f2e83f869b06c722344ac4314b415d52
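To check a download against the digests above, you can compute the hash locally; a minimal Python check (the filename assumes the wheel was saved to the current directory):

# compare a downloaded wheel's SHA256 against the published digest above
import hashlib

expected = "6d85674dbf1082cdc813f779b40ac61f3d7cf0aaf1bcd0e329753ef8acd3e504"
with open("bert_for_tf2-0.1.1-py3-none-any.whl", "rb") as f:
    actual = hashlib.sha256(f.read()).hexdigest()
assert actual == expected, "hash mismatch: corrupted or tampered download"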