


title: DGEB
app_file: leaderboard/app.py
sdk: docker
sdk_version: 4.36.1

Diverse Genomic Embedding Benchmark


Installation | Usage | Leaderboard | Citing

DGEB is a benchmark for evaluating biological sequence models on functional and evolutionary information.

DGEB is designed to evaluate model embeddings using:

  • Diverse sequences across the tree of life.
  • Diverse tasks that capture different aspects of biological function.
  • Both amino acid and nucleotide sequences.

The current version of DGEB consists of 18 datasets covering all three domains of life (Bacteria, Archaea, and Eukarya). DGEB evaluates embeddings using six different embedding tasks: Classification, BiGene mining, Evolutionary Distance Similarity (EDS), Pair Classification, Clustering, and Retrieval.

We welcome contributions of new tasks and datasets.

Installation

Install DGEB using pip.

pip install dgeb
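To confirm the install from Python, a minimal check (it only relies on the dgeb.Modality enum used in the usage examples below):

# Import the package and list the available sequence modalities.
import dgeb

print(list(dgeb.Modality))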

Usage

  • Launch an evaluation using the Python script (see cli.py):
dgeb --model facebook/esm2_t6_8M_UR50D
  • To see all supported models and tasks:
dgeb --help
  • Using the Python API (results are written to output_folder; see the sketch after this example):
import dgeb

model = dgeb.get_model("facebook/esm2_t6_8M_UR50D")
tasks = dgeb.get_tasks_by_modality(dgeb.Modality.PROTEIN)
evaluation = dgeb.DGEB(tasks=tasks)
evaluation.run(model, output_folder="results")
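The run writes per-task result files into output_folder. A minimal sketch for loading them back with the standard library (the exact file layout inside the folder is an assumption; adjust the glob to match your run):

import json
from pathlib import Path

# Walk the output folder and print every JSON result file it contains.
for result_file in Path("results").rglob("*.json"):
    with open(result_file) as f:
        print(result_file.name, json.load(f))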

Using a custom model

Custom models should subclass the dgeb.models.BioSeqTransformer abstract class and specify the modality, number of layers, and embedding dimension. See models.py for additional examples of custom model loading and inference.

import dgeb
from dgeb.models import BioSeqTransformer
from dgeb.tasks.tasks import Modality

class MyModel(BioSeqTransformer):

    @property
    def modality(self) -> Modality:
        return Modality.PROTEIN

    @property
    def num_layers(self) -> int:
        return self.config.num_hidden_layers

    @property
    def embed_dim(self) -> int:
        return self.config.hidden_size


model = MyModel(model_name='path_to/huggingface_model')
tasks = dgeb.get_tasks_by_modality(model.modality)
evaluation = dgeb.DGEB(tasks=tasks)
evaluation.run(model)
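Nucleotide models follow the same pattern. A minimal sketch, assuming the nucleotide modality is exposed as Modality.DNA (check the dgeb.Modality enum for the exact member name):

class MyDnaModel(BioSeqTransformer):
    """Wrapper for a nucleotide-sequence model (hypothetical example)."""

    @property
    def modality(self) -> Modality:
        # Assumption: the nucleotide member of the Modality enum is named DNA.
        return Modality.DNA

    @property
    def num_layers(self) -> int:
        return self.config.num_hidden_layers

    @property
    def embed_dim(self) -> int:
        return self.config.hidden_size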

Evaluating on a custom dataset

We strongly encourage users to contribute their custom datasets to DGEB. Please open a PR adding your dataset so that the community can benefit!

To evaluate on a custom dataset, first upload your dataset to the Hugging Face Hub (a sketch of pushing a dataset with the datasets library follows the example below). Then define a Task subclass with TaskMetadata that points to your Hugging Face dataset. For example, a classification task on a custom dataset can be defined as follows:

import dgeb
from dgeb.models import BioSeqTransformer
from dgeb.tasks import Dataset, Task, TaskMetadata, TaskResult
from dgeb.tasks.classification_tasks import run_classification_task
from dgeb.tasks.tasks import Modality

class MyCustomTask(Task):
    metadata = TaskMetadata(
        id="my_custom_classification",
        display_name="...",
        description="...",
        type="classification",
        modality=Modality.PROTEIN,
        datasets=[
            Dataset(
                path="path_to/huggingface_dataset",
                revision="...",
            )
        ],
        primary_metric_id="f1",
    )

    def run(self, model: BioSeqTransformer) -> TaskResult:
        return run_classification_task(model, self.metadata)

model = dgeb.get_model("facebook/esm2_t6_8M_UR50D")
evaluation = dgeb.DGEB(tasks=[MyCustomTask])
evaluation.run(model)
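As noted above, the dataset itself can be pushed to the Hub with the datasets library before the task is defined. A minimal sketch with hypothetical column names and repository id (adapt both to what your task runner expects):

from datasets import Dataset

# Toy records with hypothetical column names; real datasets should provide
# the splits and label scheme expected by the classification task runner.
records = {
    "sequence": ["MKTAYIAKQR", "MVLSPADKTN"],
    "label": ["class_a", "class_b"],
}

ds = Dataset.from_dict(records)
# Requires prior authentication, e.g. `huggingface-cli login`.
ds.push_to_hub("your-username/my_custom_dataset")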

Leaderboard

To add your submission to the DGEB leaderboard, follow the steps below.

  1. Fork the DGEB repository by following GitHub's Forking Workflow instructions.

  2. Add your submission .json file to the leaderboard/submissions/<HF_MODEL_NAME>/ directory (a sketch for generating the results file follows these steps).

mv /path/to/<SUBMISSION_FILE>.json /path/to/DGEB/leaderboard/submissions/<HF_MODEL_NAME>/
  3. Update your fork with the new submission:
git add leaderboard/submissions/<HF_MODEL_NAME>/<SUBMISSION_FILE>.json
git commit -m "Add submission for <HF_MODEL_NAME>"
git push
  4. Open a pull request to the main branch of the repository via the GitHub interface.

  5. Once the PR is reviewed and merged, your submission will be added to the leaderboard!
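The submission .json is simply the output of an evaluation run. A minimal sketch of producing it with the Python API (the directory and filename conventions expected by the leaderboard are an assumption; match them to the placeholders in the steps above):

import dgeb

# Placeholder model id; substitute your own Hugging Face model name.
model = dgeb.get_model("facebook/esm2_t6_8M_UR50D")
tasks = dgeb.get_tasks_by_modality(dgeb.Modality.PROTEIN)
evaluation = dgeb.DGEB(tasks=tasks)

# Write results locally, then copy the produced .json file into
# leaderboard/submissions/<HF_MODEL_NAME>/ as described in step 2.
evaluation.run(model, output_folder="results")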

Acknowledgements

DGEB follows the design of the text embedding benchmark MTEB, developed by Hugging Face 🤗. The evaluation code is adapted from the MTEB codebase.

Citing

DGEB was introduced in "Diverse Genomic Embedding Benchmark for Functional Evaluation Across the Tree of Life". If you use DGEB, please cite:

TODO

