SAF-Datasets
Dataset loading and annotation facilities for the Simple Annotation Framework
The saf-datasets library provides easy access to Natural Language Processing (NLP) datasets and tools to facilitate annotation at the document, sentence, and token levels.
It is being developed to address a need for flexibility in manipulating NLP annotations that is not fully covered by popular dataset libraries such as HuggingFace Datasets and torch Datasets, namely:
- Including and modifying annotations on existing datasets (see the sketch below).
- Standardized API.
- Support for complex and multi-level annotations.
saf-datasets is built upon the Simple Annotation Framework (SAF) library, which provides its data model and API.
It also provides annotator classes to automatically label existing and new datasets.
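Because annotations are exposed as plain dictionaries on sentences and tokens, adding or modifying labels on an existing dataset is a direct assignment. Below is a minimal sketch, assuming the dataset object is iterable (it supports len() and indexing); the annotation key num_tokens is an arbitrary illustration, not a library convention:

from saf_datasets import STSBDataSet

dataset = STSBDataSet()
for sentence in dataset:
    # Hypothetical annotation: store the token count of each sentence.
    # The key "num_tokens" is illustrative, not part of the library.
    sentence.annotations["num_tokens"] = len(sentence.tokens)

print(dataset[0].annotations["num_tokens"])
# 6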
Installation
To install, you can use pip:
pip install saf-datasets
Usage
Loading datasets
from saf_datasets import STSBDataSet
dataset = STSBDataSet()
print(len(dataset)) # Size of the dataset
# 17256
print(dataset[0].surface) # First sentence in the dataset
# A plane is taking off
print([token.surface for token in dataset[0].tokens]) # Tokens (SpaCy) of the first sentence.
# ['A', 'plane', 'is', 'taking', 'off', '.']
print(dataset[0].annotations) # Annotations for the first sentence
# {'split': 'train', 'genre': 'main-captions', 'dataset': 'MSRvid', 'year': '2012test', 'sid': '0001', 'score': '5.000', 'id': 0}
# There are no token annotations in this dataset
print([(tok.surface, tok.annotations) for tok in dataset[0].tokens])
# [('A', {}), ('plane', {}), ('is', {}), ('taking', {}), ('off', {}), ('.', {})]
Available datasets: AllNLI, CODWOE, CPAE, EntailmentBank, STSB, Wiktionary, WordNet (Filtered).
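Each dataset is exposed through its own class following the same access pattern, so switching corpora only changes the import. For example, with the CPAE dataset (also used in the wrapper example further below):

from saf_datasets import CPAEDataSet

dataset = CPAEDataSet()
print(dataset[0].surface)  # First sentence in the CPAE dataset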
Annotating datasets
from saf_datasets import STSBDataSet
from saf_datasets.annotators import SpacyAnnotator
dataset = STSBDataSet()
annotator = SpacyAnnotator() # Needs spacy and en_core_web_sm to be installed.
annotator.annotate(dataset)
# Now tokens are annotated
for tok in dataset[0].tokens:
    print(tok.surface, tok.annotations)
# A {'pos': 'DET', 'lemma': 'a', 'dep': 'det', 'ctag': 'DT'}
# plane {'pos': 'NOUN', 'lemma': 'plane', 'dep': 'nsubj', 'ctag': 'NN'}
# is {'pos': 'AUX', 'lemma': 'be', 'dep': 'aux', 'ctag': 'VBZ'}
# taking {'pos': 'VERB', 'lemma': 'take', 'dep': 'ROOT', 'ctag': 'VBG'}
# off {'pos': 'ADP', 'lemma': 'off', 'dep': 'prt', 'ctag': 'RP'}
# . {'pos': 'PUNCT', 'lemma': '.', 'dep': 'punct', 'ctag': '.'}
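Once annotated, token-level labels can be queried like any other dictionary entry. As a minimal sketch (again assuming the dataset object is iterable), the following collects the lemmas of all tokens tagged as nouns:

# Sketch: gather noun lemmas across the annotated dataset.
noun_lemmas = [tok.annotations["lemma"]
               for sentence in dataset
               for tok in sentence.tokens
               if tok.annotations.get("pos") == "NOUN"]
print(noun_lemmas[0])
# plane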
Using with other libraries
saf-datasets provides wrappers so the datasets can be used with libraries that expect HuggingFace or torch datasets:
from saf_datasets import CPAEDataSet
from saf_datasets.wrappers.torch import TokenizedDataSet
from transformers import AutoTokenizer
dataset = CPAEDataSet()
tokenizer = AutoTokenizer.from_pretrained("gpt2", padding_side="left", add_prefix_space=True)
tok_ds = TokenizedDataSet(dataset, tokenizer, max_len=128, one_hot=False)
print(tok_ds[:10])
# tensor([[50256, 50256, 50256, ..., 2263, 572, 13],
# [50256, 50256, 50256, ..., 2263, 572, 13],
# [50256, 50256, 50256, ..., 781, 1133, 13],
# ...,
# [50256, 50256, 50256, ..., 2712, 19780, 13],
# [50256, 50256, 50256, ..., 2685, 78, 13],
# [50256, 50256, 50256, ..., 2685, 78, 13]])
print(tok_ds[:10].shape)
# torch.Size([10, 128])
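Since TokenizedDataSet supports len() and index/slice access, it should also be consumable by a standard torch DataLoader. This is a sketch under that assumption, not a documented part of the wrapper API:

from torch.utils.data import DataLoader

# Assumes TokenizedDataSet follows the torch map-style dataset protocol.
loader = DataLoader(tok_ds, batch_size=32, shuffle=True)
batch = next(iter(loader))
print(batch.shape)
# expected: torch.Size([32, 128])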