A tool for dataset ingestion and management.

Project description

Pygestor

A platform for seamlessly acquiring, organizing, and managing diverse datasets. It offers AI researchers a one-line downloader and data loader for quick access to data, along with a scalable, easily manageable system for future dataset acquisition.

License: MIT

Quick Start

Install dependencies

pip install -r requirements.txt
python run-gui.py

The module can be used through terminal commands or Python APIs (which expose more functionality). For Python API use cases, please refer to this notebook.

Configurations

Edit pygestor/__init__.py to change the default system settings. In particular, set DATA_DIR to the desired data storage location, either a local path or a remote path, such as a mounted NFS.
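As a minimal sketch, the edited settings might look like the following. Only DATA_DIR is named in this document; the comment style and example paths are assumptions for illustration:

```python
# pygestor/__init__.py (illustrative excerpt)
# DATA_DIR may point to a local directory or a remote mount such as NFS.
DATA_DIR = "/mnt/nfs/datasets"   # e.g. a mounted NFS share
# DATA_DIR = "D:/data"           # or a local path
```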

Data info and availability

To list supported datasets:

python cli.py -l

To list subsets in a dataset:

python cli.py -l -d <dataset_name>

To list partitions in a subset:

python cli.py -l -d <dataset_name> -s <subset_name>

Dataset management and extension

To download a specific subset:

python cli.py -l -d <dataset_name> -s <subset_name>

To download specific partitions, use the Python API pygestor.download().

To remove downloaded data files in a subset:

python cli.py -r -d <dataset_name> -s <subset_name>

To support a new dataset, add a new class file to pygestor/datasets that defines how to organize, download, and load data, following the example in pygestor/datasets/wikipedia.py. Then update the metadata by running:

python cli.py -init -d <new_dataset_name>
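The real interface to follow is the one in pygestor/datasets/wikipedia.py; the skeleton below is only a hypothetical illustration of the three responsibilities the text names (organize, download, load), with invented method names and signatures:

```python
# Hypothetical skeleton for a new file under pygestor/datasets/.
# Method names and signatures here are illustrative assumptions;
# mirror pygestor/datasets/wikipedia.py for the actual interface.

class MyDataset:
    name = "my_dataset"

    def get_subsets(self):
        """Return the subset names this dataset is organized into."""
        return ["train", "validation"]

    def download(self, subset, data_dir):
        """Fetch the raw files for one subset into data_dir."""
        raise NotImplementedError

    def load(self, subset, data_dir):
        """Load a downloaded subset into memory for consumers."""
        raise NotImplementedError
```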

Technical Details

Storage

The data is stored in a file storage system and organized into three levels: dataset, subset (distinguished by version, language, class, split, annotation, etc.), and partition (splitting large files into smaller chunks for memory efficiency), as follows:

dataset_A
├── subset_a
│   ├── partition_1
│   └── partition_2
└── subset_b
    ├── partition_1
    └── partition_2
...
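The layout above can be materialized with a few lines of standard-library Python. This is a stand-alone sketch of the hierarchy, not pygestor's own initialization code; the names are the placeholders from the diagram:

```python
import tempfile
from pathlib import Path

def init_layout(data_dir, tree):
    """Create the dataset/subset/partition hierarchy under data_dir."""
    for dataset, subsets in tree.items():
        for subset, partitions in subsets.items():
            for partition in partitions:
                path = Path(data_dir) / dataset / subset / partition
                path.parent.mkdir(parents=True, exist_ok=True)
                path.touch()  # partitions are the actual data files

root = tempfile.mkdtemp()
init_layout(root, {
    "dataset_A": {
        "subset_a": ["partition_1", "partition_2"],
        "subset_b": ["partition_1", "partition_2"],
    },
})
```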

File storage is chosen over other storage types for its cost efficiency, scalability, and ease of management.

Dataset info and storage status are tracked in a metadata file, metadata.json, for efficient reference and updates.
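A minimal sketch of what such tracking can look like, assuming a nested dataset/subset structure mirroring the storage layout. The actual schema of pygestor's metadata.json is defined by the project; the keys and values below are assumptions:

```python
import json
import os
import tempfile

# Illustrative metadata structure (assumed schema, not pygestor's).
metadata = {
    "dataset_A": {
        "subsets": {
            "subset_a": {
                "downloaded": True,
                "partitions": ["partition_1", "partition_2"],
            }
        }
    }
}

path = os.path.join(tempfile.mkdtemp(), "metadata.json")
with open(path, "w") as f:
    json.dump(metadata, f, indent=2)  # persist status to disk

with open(path) as f:
    status = json.load(f)  # cheap lookup without scanning the filesystem
```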

Dependencies

  • python >= 3.11
  • huggingface_hub: Provides native support for datasets hosted on Hugging Face, making it an ideal library for downloading.
  • pyarrow: Used to compress and extract parquet files, a data file format designed for efficient data storage and retrieval, compatible with pandas.
  • pandas: Used to load the text dataset into memory for downstream data consumers. It provides a handy API for data manipulation and access, as well as chunking and datatype adjustments for memory efficiency.
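The memory-efficiency idea behind partitioning and chunked loading can be sketched without the real libraries: a consumer iterates over one chunk at a time instead of holding the whole dataset in memory. This is a standard-library stand-in; the actual loader uses pyarrow and pandas:

```python
def iter_partitions(records, partition_size):
    """Yield fixed-size chunks of records so that only one
    partition's worth of data is held in memory at a time."""
    for start in range(0, len(records), partition_size):
        yield records[start:start + partition_size]

rows = list(range(10))
partitions = list(iter_partitions(rows, 4))
```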
