A TensorFlow 2.0 Keras implementation of BERT.
This repo contains a TensorFlow 2.0 Keras implementation of google-research/bert, with support for loading the original pre-trained weights and producing activations numerically identical to those of the original model.
LICENSE
MIT. See the LICENSE file.
Install
bert-for-tf2 is on the Python Package Index (PyPI):
pip install bert-for-tf2
Usage
TBD
Resources
BERT - BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
google-research/bert - the original BERT implementation
kpe/params-flow - utilities for reducing Keras boilerplate code in custom layers