An easy-to-use text representation extraction library built on top of the Transformers library.
Simple Representations
This library is based on the Transformers library by HuggingFace. It lets you quickly extract text representations from Transformer models: only two lines of code are needed to initialize the required model and extract representations from it.
Installation
This repository is tested on Python 3.6.8 and PyTorch 1.2.0.
With pip
First, you need to install PyTorch. Please refer to the PyTorch installation page for the specific install command for your platform.

Once PyTorch is installed, Simple Representations can be installed using pip as follows:

```
pip install simplerepresentations
```
From source
Here, too, you first need to install PyTorch. Please refer to the PyTorch installation page for the specific install command for your platform.

Once PyTorch is installed, you can install from source by cloning the repository and running:

```
pip install .
```
Usage
Minimal Start
The following example extracts text representations from the BERT Base Uncased model for the sentences `Hello Transformers!` and `It's very simple.`
```python
from simplerepresentations import RepresentationModel


def load_data():
    return ['Hello Transformers!', "It's very simple."]


if __name__ == '__main__':
    model_type = 'bert'
    model_name = 'bert-base-uncased'

    representation_model = RepresentationModel(
        model_type=model_type,
        model_name=model_name,
        batch_size=32,
        max_seq_length=10,  # truncate sentences to at most 10 tokens
        combination_method='cat',  # concatenate the last `last_hidden_to_use` hidden states
        last_hidden_to_use=4  # use the last 4 hidden states to build token representations
    )

    text_a = load_data()

    all_sentences_representations, all_tokens_representations = representation_model(text_a=text_a)

    print(all_sentences_representations.shape)  # (2, 768) => (number of sentences, hidden size)
    print(all_tokens_representations.shape)  # (2, 10, 3072) => (number of sentences, number of tokens, hidden size * 4)
```
You can change the code in the `load_data` function to load your own data from any source you want (e.g. a CSV file).
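For instance, here is a minimal sketch of a `load_data` variant that reads sentences from a CSV file. The column name `text` and the file layout are assumptions for illustration; adapt them to your own data.

```python
import csv
import tempfile


# Hypothetical replacement for load_data: read sentences from a CSV file
# that has a 'text' column (an assumed column name).
def load_data(csv_path):
    with open(csv_path, newline='', encoding='utf-8') as f:
        return [row['text'] for row in csv.DictReader(f)]


# Create a small CSV file just to demonstrate the function.
with tempfile.NamedTemporaryFile('w', suffix='.csv', delete=False, newline='') as f:
    f.write('text\nHello Transformers!\n"It\'s very simple."\n')
    path = f.name

sentences = load_data(path)
print(sentences)  # ['Hello Transformers!', "It's very simple."]
```

The resulting list can be passed to `representation_model(text_a=sentences)` exactly like the hard-coded list in the example above.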
Default Settings
The default settings for the `RepresentationModel` class are given below:
- `batch_size` (32): integer — The batch size used while extracting representations.
- `max_seq_length` (128): integer — The maximum sequence length the model will support.
- `last_hidden_to_use` (1): integer — The number of last hidden states that will be used to build the representations.
- `combination_method` ('sum'): string ('sum' or 'cat') — The method used to combine the last `last_hidden_to_use` hidden states.
- `use_cuda` (True): boolean — Whether or not to use CUDA.
- `process_count` (cpu_count() - 2 if cpu_count() > 2 else 1): integer — The number of CPU cores (processes) to use when converting examples to features. Defaults to (number of cores - 2), or 1 if the machine has 2 or fewer cores.
- `chunksize` (500): integer — The size of the chunks the examples are divided into when converting them to features.
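To make the effect of `combination_method` concrete, here is a small illustrative sketch (not the library's internal code) of how `'sum'` and `'cat'` combine the last hidden states for a single token, using a toy hidden size of 3 and four hidden states:

```python
# Toy example: four hidden states for one token (as with last_hidden_to_use=4),
# each of (toy) hidden size 3. Values are arbitrary.
last_hidden_states = [
    [0.1, 0.2, 0.3],
    [0.4, 0.5, 0.6],
    [0.7, 0.8, 0.9],
    [1.0, 1.1, 1.2],
]

# 'sum': element-wise sum -> the representation keeps the hidden size.
summed = [sum(values) for values in zip(*last_hidden_states)]

# 'cat': concatenation -> the representation size is hidden_size * last_hidden_to_use.
concatenated = [value for state in last_hidden_states for value in state]

print(len(summed))        # 3  (hidden size)
print(len(concatenated))  # 12 (hidden size * 4)
```

This is why the token representations in the minimal example above have size 3072 (768 * 4) with `combination_method='cat'`, whereas `'sum'` would keep them at 768.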
Current Pretrained Models
You can find the complete list of the currently available pretrained models in the Transformers library documentation.
Acknowledgements
None of this would have been possible without the hard work of the HuggingFace team in developing the Transformers library.

Also, many of the ideas used in this repository were inspired by the Simple Transformers library.