
Sign Language Datasets

This repository includes TFDS data loaders for sign language datasets.

Installation

From Source

pip install git+https://github.com/sign-language-processing/datasets.git

PyPi

pip install sign-language-datasets

Usage

We demonstrate how to load every dataset in examples/load.ipynb (also runnable on Colab).

Our custom configuration lets you choose, among other options, the video resolution and FPS. For example:

import tensorflow_datasets as tfds
import sign_language_datasets.datasets
from sign_language_datasets.datasets.config import SignDatasetConfig

# Loading a dataset with default configuration
aslg_pc12 = tfds.load("aslg_pc12")

# Loading a dataset with custom configuration
config = SignDatasetConfig(name="videos_and_poses256x256:12",
                           version="3.0.0",  # Specific version
                           include_video=True,  # Download and load dataset videos
                           process_video=True,  # Process videos to tensors, or only save path to video
                           fps=12,  # Load videos at a constant 12 fps
                           resolution=(256, 256),  # Convert videos to a constant resolution, 256x256
                           include_pose="holistic")  # Download and load Holistic pose estimation
rwth_phoenix2014_t = tfds.load(name='rwth_phoenix2014_t', builder_kwargs=dict(config=config))
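
Once loaded, these datasets behave like any other TFDS dataset. As a minimal sketch (assuming the splits defined by the dataset, e.g. "train"), iteration looks like this:

for datum in rwth_phoenix2014_t["train"]:
    # "id" and "text" follow the common data interface described below
    print(datum["id"].numpy().decode("utf-8"))
    print(datum["text"].numpy().decode("utf-8"))
    break  # print only the first example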

Datasets

Dataset              Videos   Poses                Versions
aslg_pc12            N/A      N/A                  0.0.1
asl-lex              No                            2.0.0
rwth_phoenix2014_t   Yes      Holistic             3.0.0
autsl                Yes      OpenPose, Holistic   1.0.0
dgs_corpus           Yes      OpenPose, Holistic   3.0.0
dgs_types            Yes                           3.0.0
how2sign             Yes      OpenPose             1.0.0
sign2mint            Yes                           1.0.0
signtyp              Links                         1.0.0
swojs_glossario      Yes                           1.0.0
SignBank             N/A                           1.0.0
wlasl                Failed   OpenPose             None
wmtslt               Yes      OpenPose, Holistic   1.2.0
signsuisse           Yes                           1.0.0
msasl                                              None
Video-Based CSL                                    None
RVL-SLLL ASL                                       None
ngt_corpus           Yes                           3.0.0
bsl_corpus           No       No                   3.0.0

Data Interface

Wherever possible, we follow this common interface to make it easy to swap datasets:

{
    "id": tfds.features.Text(),
    "signer": tfds.features.Text() | tf.int32,
    "video": tfds.features.Video(shape=(None, HEIGHT, WIDTH, 3)),
    "depth_video": tfds.features.Video(shape=(None, HEIGHT, WIDTH, 1)),
    "fps": tf.int32,
    "pose": {
        "data": tfds.features.Tensor(shape=(None, 1, POINTS, CHANNELS), dtype=tf.float32),
        "conf": tfds.features.Tensor(shape=(None, 1, POINTS), dtype=tf.float32)
    },
    "gloss": tfds.features.Text(),
    "text": tfds.features.Text()
}
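
For illustration, a minimal sketch of consuming these fields from a loaded example (assuming a configuration with include_pose="holistic", so that the "pose" feature is present):

for datum in rwth_phoenix2014_t["train"].take(1):
    pose_data = datum["pose"]["data"]  # shape: (frames, people, points, channels)
    pose_conf = datum["pose"]["conf"]  # shape: (frames, people, points)
    print(pose_data.shape, pose_conf.shape)
    print(datum["gloss"].numpy().decode("utf-8"))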

Adding a new dataset

For general instructions, see the TFDS guide to writing custom datasets. Instructions below are specific to this repository.

Make a new folder inside sign_language_datasets/datasets with the same name as the dataset. By convention, the dataset name should be lowercase, with words separated by underscores. Example:

cd sign_language_datasets/datasets
tfds new new_dataset

For our purposes, creating a custom TFDS dataset means writing a new class that inherits from tfds.core.GeneratorBasedBuilder. If you use tfds new to create a new dataset, the dataset class is stored in a file with the exact same name as the dataset, i.e. new_dataset.py. new_dataset.py must contain a line similar to:

class NewDataset(tfds.core.GeneratorBasedBuilder):
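
A minimal, hypothetical skeleton of such a class is sketched below; the feature set, URL, and method bodies are placeholders for illustration, not the repository's actual implementation:

import tensorflow_datasets as tfds


class NewDataset(tfds.core.GeneratorBasedBuilder):
    """DatasetBuilder for new_dataset."""

    VERSION = tfds.core.Version("1.0.0")

    def _info(self) -> tfds.core.DatasetInfo:
        # Declare the features this dataset yields.
        return tfds.core.DatasetInfo(
            builder=self,
            features=tfds.features.FeaturesDict({
                "id": tfds.features.Text(),
                "text": tfds.features.Text(),
            }),
        )

    def _split_generators(self, dl_manager: tfds.download.DownloadManager):
        # Download the raw data and map split names to example generators.
        path = dl_manager.download_and_extract("https://example.com/data.zip")
        return {"train": self._generate_examples(path)}

    def _generate_examples(self, path):
        # Yield (unique key, example dict) pairs matching the features in _info().
        yield "example-0", {"id": "example-0", "text": "hello world"}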

Registering a new dataset

A custom dataset is added to the TFDS dataset registry by importing its class. For this reason, the folder sign_language_datasets/datasets/new_dataset must contain an __init__.py file that imports the class NewDataset:

from .new_dataset import NewDataset

Even though the class is named NewDataset, it is registered under a lowercase name: each uppercase character is interpreted as the start of a new word and is separated by an underscore. This means that the dataset can be loaded as follows:

ds = tfds.load('new_dataset')
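
Note that registration only happens once the module is imported, so (as in the Usage section above) the package import must precede the call to tfds.load:

import tensorflow_datasets as tfds
import sign_language_datasets.datasets  # noqa: F401 -- importing registers the dataset classes

ds = tfds.load('new_dataset')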

Generating checksums

The folder for the new dataset should contain a file checksums.tsv with checksums for every file in the dataset. This allows the TFDS download manager to check the integrity of the data it downloads. Use the tfds build tool to generate the checksum file:

tfds build --register_checksums new_dataset.py

Use the --config argument to select a dataset configuration that includes all files (e.g., one that includes the video files, if any). The default behaviour is to build all configurations, which might be redundant.
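
For example (the config name "videos" here is hypothetical; use a configuration defined for your dataset):

tfds build --register_checksums --config videos new_dataset.py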

Why not Huggingface Datasets?

Huggingface Datasets do not work well with videos: they lack native support for the video type as well as for arbitrary tensors. Furthermore, they currently have memory leaks that prevent saving even the smallest of video datasets.

Cite

@misc{moryossef2021datasets, 
    title={Sign Language Datasets},
    author={Moryossef, Amit and M\"{u}ller, Mathias},
    howpublished={\url{https://github.com/sign-language-processing/datasets}},
    year={2021}
}
