
A library to manipulate data for our DMS prediction models.

Download your RNA data from HuggingFace with rouskinhf!

A repository for manipulating the data used by our RNA structure prediction model. It allows you to:

  • pull datasets from the Rouskinlab HuggingFace organization
  • create datasets from local files in the following formats and push them to HuggingFace:
    • .fasta
    • .ct
    • .json (DREEM output format)
    • .json (Rouskinlab's huggingface format)

Important notes

  • Sequences containing bases other than A, C, G, T, U, N, a, c, g, t, u, n are not supported; such data will be filtered out.


Push a new release to PyPI

  1. Set the version to vx.y.z in pyproject.toml, then run git add . && git commit -m 'vx.y.z' && git push in a terminal.
  2. Create and push a git tag vx.y.z by running git tag 'vx.y.z' && git push --tag.
  3. Create a release for the tag vx.y.z on GitHub Releases.
  4. Check that the GitHub Action Publish distributions 📦 to PyPI passed on GitHub Actions.
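Steps 1 and 2 can be collected into a short shell snippet; a sketch below, using the hypothetical version v0.2.2, that prints the commands for review instead of running them (remove the echo to execute):

```shell
# Hypothetical version number; replace with your actual release.
VERSION="v0.2.2"

# Print the release commands for review; drop `echo` to run them for real.
echo "git add . && git commit -m '$VERSION' && git push"
echo "git tag '$VERSION' && git push --tag"
```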

Installation

Get a HuggingFace token

Go to HuggingFace and create an account. Then go to your settings and copy your access token (huggingface.co/settings/tokens).

Create an environment file

Open a terminal and type:

nano env

Copy-paste the following content and change the values to your own:

HUGGINGFACE_TOKEN="your token here"  # you must change this to your HuggingFace token
DATA_FOLDER="data/datafolders" # where datafolders are stored by default; change this to store them somewhere else
DATA_FOLDER_TESTING="data/input_files_for_testing" # Don't touch this
RNASTRUCTURE_PATH="/Users/ymdt/src/RNAstructure/exe" # Change this to the path of your RNAstructure executable
RNASTRUCTURE_TEMP_FOLDER="temp" # You can change this to the path of your RNAstructure temp folder

Then save the file and exit nano.

Source the environment

source env

Install the package with pip

pip install rouskinhf

Tutorials

Authenticate your machine to HuggingFace

See the tutorial.

Download a datafolder from HuggingFace

See the tutorial.

Create a datafolder from local files and push it to HuggingFace

See the tutorial.

About

Sourcing the environment and keeping your environment variable secret

The variables defined in the env file are required by rouskinhf. Make sure that before you use rouskinhf, you run in a terminal:

source env

or, in a Jupyter notebook:

!pip install python-dotenv
%load_ext dotenv
%dotenv env

The point of using environment variables is to keep your HuggingFace token private. Make sure to add your env file to your .gitignore so your token doesn't get pushed to any public repository.
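Once loaded by either method above, the variables are visible to Python through os.environ. A minimal sanity check, using the variable names from the env file:

```python
import os

# Variables that the env file above defines; rouskinhf reads them from the environment.
required = ["HUGGINGFACE_TOKEN", "DATA_FOLDER"]

# Report any that are missing, instead of failing later with a cryptic error.
missing = [name for name in required if name not in os.environ]
print("Missing variables:", missing or "none")
```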

Import data with import_dataset

This repo provides a function import_dataset, which allows you to pull a dataset from HuggingFace and store it locally. If the data is already stored locally, it is loaded from the local folder. Two types of data are available: the DMS signal and the structure, given as tuples of paired bases. The function has the following signature:

def import_dataset(name:str, data:str, force_download:bool=False)->np.ndarray:

    """Finds the dataset with the given name for the given type of data.

    Parameters
    ----------

    name : str
        Name of the dataset to find.
    data : str
        Name of the type of data to find the dataset for (structure or DMS).
    force_download : bool
        Whether to force download the dataset from HuggingFace Hub. Defaults to False.

    Returns
    -------

    ndarray
        The dataset with the given name for the given type of data.

    Example
    -------

    >>> import_dataset(name='for_testing', data='structure').keys()
    dict_keys(['references', 'sequences', 'structure'])
    >>> import_dataset(name='for_testing', data='DMS').keys()
    dict_keys(['references', 'sequences', 'DMS'])
    >>> import_dataset(name='for_testing', data='structure', force_download=True).keys()
    dict_keys(['references', 'sequences', 'structure'])
    >>> import_dataset(name='for_testing', data='DMS', force_download=True).keys()
    dict_keys(['references', 'sequences', 'DMS'])

FYI, the datafolder object

The datafolder object is a wrapper around your local folder and the HuggingFace API that keeps a consistent data structure across your datasets. It provides methods to create datasets from various input formats, store the data and metadata in a systematic way, and push to / pull from HuggingFace.

On HuggingFace, the datafolder stores the data under the following structure:

HUGGINGFACE DATAFOLDER
- [datafolder name]
    - source
        - whichever file(s) you used to create the dataset (fasta, set of CTs, etc.).
    - data.json # the data under a human readable format.
    - info.json # the metadata of the dataset. This file indicates how we got the DMS signal and the structures (directly from the source or from a prediction).
    - README.md # the metadata of the dataset in a human readable format.

Locally, we have the same structure, with the addition of .npy files containing the data in a machine-readable format. Each .npy file holds a NumPy array, and the file's name matches the corresponding key in the data.json file. Source files are not downloaded by default. Hence, the local structure is:

LOCAL DATAFOLDER
- [datafolder name]
    ...
    - README.md # the metadata of the dataset in a human readable format
    - references.npy
    - sequences.npy
    - base_pairs.npy
    - dms.npy
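Given this layout, the .npy files can be read directly with NumPy. A minimal sketch — the load_datafolder helper and the example path are hypothetical, not part of rouskinhf's API:

```python
from pathlib import Path

import numpy as np

def load_datafolder(path):
    """Load every .npy file in a local datafolder into a dict keyed by filename.

    Each key matches the corresponding key in data.json (e.g. 'sequences').
    """
    return {
        f.stem: np.load(f, allow_pickle=True)  # allow_pickle: arrays may hold strings
        for f in Path(path).glob("*.npy")
    }

# Example (path assumes the default DATA_FOLDER from the env file):
# data = load_datafolder("data/datafolders/my_dataset")
# data["sequences"], data["base_pairs"], ...
```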
