
BERT token-level embedding with MXNet

Project description

Bert Embeddings


BERT, published by Google, is a new way to obtain pre-trained language model word representations. Many NLP tasks benefit from BERT to achieve state-of-the-art results.

The goal of this project is to obtain sentence and token embeddings from BERT's pre-trained model. This way, instead of building and fine-tuning an end-to-end NLP model, you can build your model by simply using the sentence or token embeddings.

This project is implemented with @MXNet. Special thanks to the @gluon-nlp team.

Install

pip install bert-embedding
pip install https://github.com/dmlc/gluon-nlp/tarball/master
# If you want to run on GPU machine, please install `mxnet-cu92`.
pip install mxnet-cu92

This project uses APIs from gluonnlp==0.5.1, which has not been released yet. Once 0.5.1 is released, installing gluonnlp from source will no longer be necessary.
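To confirm which gluon-nlp snapshot the source install picked up, a quick check (a minimal sketch; the exact development version string may vary) is:

import gluonnlp
print(gluonnlp.__version__)  # expect '0.5.1' or a development snapshot of it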

Usage

from bert_embedding import BertEmbedding

bert_abstract = """We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers.
 Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations by jointly conditioning on both left and right context in all layers.
 As a result, the pre-trained BERT representations can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications. 
BERT is conceptually simple and empirically powerful. 
It obtains new state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE benchmark to 80.4% (7.6% absolute improvement), MultiNLI accuracy to 86.7 (5.6% absolute improvement) and the SQuAD v1.1 question answering Test F1 to 93.2 (1.5% absolute improvement), outperforming human performance by 2.0%."""
sentences = bert_abstract.split('\n')
bert = BertEmbedding()
result = bert.embedding(sentences)

Each element of the result is a tuple containing the following three parts:

  • sentence embedding
  • tokens
  • token embeddings

Below is the result from the demo code above:

result[0][0]
# array([-0.835946  , -0.4605566 , -0.95620036, ..., -0.95608854,
#       -0.6258104 ,  0.7697007 ], dtype=float32)
result[0][0].shape
# (768,)
result[0][1]
# ['we', 'introduce', 'a', 'new', 'language', 'representation', 'model', 'called', 'bert', ',', 'which', 'stands', 'for', 'bidirectional', 'encoder', 'representations', 'from', 'transformers']
len(result[0][1])
# 18
len(result[0][2])
# 18
result[0][2][0]
# array([ 0.4805648 ,  0.18369392, -0.28554988, ..., -0.01961522,
#        1.0207764 , -0.67167974], dtype=float32)
result[0][2][0].shape
# (768,)
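As one illustration of what these vectors can be used for (a sketch of ours, not part of the package), token embeddings can be compared directly, e.g. with cosine similarity. The token indices below come from the token list printed above:

import numpy as np

tokens = result[0][1]
vecs = result[0][2]

def cosine(a, b):
    # plain cosine similarity between two 768-dimensional token vectors
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

i, j = tokens.index('bert'), tokens.index('transformers')
print(cosine(vecs[i], vecs[j]))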

Available pre-trained BERT models

                  book_corpus_wiki_en_uncased   book_corpus_wiki_en_cased   wiki_multilingual
bert_12_768_12                ✓                             ✓                       ✓
bert_24_1024_16               x                             ✓                       x

Example of using the large pre-trained BERT model from Google

from bert_embedding.bert import BertEmbedding

bert = BertEmbedding(model='bert_24_1024_16', dataset_name='book_corpus_wiki_en_cased')

Source: gluonnlp
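To run inference on GPU (after installing mxnet-cu92 as noted above), an MXNet context can be passed to the constructor. This is a hedged sketch assuming the ctx parameter of this version:

import mxnet as mx
from bert_embedding import BertEmbedding

# assumption: BertEmbedding accepts an MXNet context via the ctx argument
bert = BertEmbedding(ctx=mx.gpu(0))
result = bert.embedding(sentences)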

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bert_embedding-0.1.1.tar.gz (6.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

bert_embedding-0.1.1-py3-none-any.whl (10.5 kB)

Uploaded Python 3

File details

Details for the file bert_embedding-0.1.1.tar.gz.

File metadata

  • Download URL: bert_embedding-0.1.1.tar.gz
  • Upload date:
  • Size: 6.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.5.0.1 requests/2.18.4 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.0 CPython/3.5.6

File hashes

Hashes for bert_embedding-0.1.1.tar.gz
Algorithm Hash digest
SHA256 43ecc2b60fafcca46d237d84bff8fc97e578d11d3ee9b61e4ffabf8be7e1586a
MD5 c16dac225e46a37c710852aa1ef086fc
BLAKE2b-256 016f886bb46b6013a6a409d1ee088a36492881a3f4d04e828aa21e8e925a5364

See more details on using hashes here.
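To check a downloaded archive against the SHA256 digest above, one option (a generic Python sketch, not PyPI tooling) is:

import hashlib

expected = '43ecc2b60fafcca46d237d84bff8fc97e578d11d3ee9b61e4ffabf8be7e1586a'
with open('bert_embedding-0.1.1.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print('OK' if digest == expected else 'hash mismatch')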

File details

Details for the file bert_embedding-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: bert_embedding-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 10.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.5.0.1 requests/2.18.4 setuptools/40.8.0 requests-toolbelt/0.9.1 tqdm/4.31.0 CPython/3.5.6

File hashes

Hashes for bert_embedding-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 3153099d62660613b212e870105e47636bcabf3a2128eb0158215ca7446be6eb
MD5 a1f1c64ad64b2fe94c33262cb0889b8f
BLAKE2b-256 2f1a091a641e70fdca256233f7929d2b378159871b54d8a0d1f33443b50df34e

See more details on using hashes here.
