Ersilia's LazyQSAR

A library to quickly build supervised QSAR models for chemistry.

Installation

Install LazyQSAR from source:

git clone https://github.com/ersilia-os/lazy-qsar.git
cd lazy-qsar
python -m pip install -e .

To use the default LazyQSAR descriptors, install the optional dependencies:

python -m pip install -e ".[descriptors]"

Binary Classification

LazyQSAR's binary classifier can run either with default descriptors or with custom descriptors passed by the user.

Built-in descriptors

Instantiate the LazyBinaryQSAR class with a mode of choice (fast, default, slow):

from lazyqsar.qsar import LazyBinaryQSAR

model = LazyBinaryQSAR(mode="fast")
model.fit(smiles_list=smiles_train, y=y_train)
y_hat = model.predict_proba(smiles_list=smiles_test)[:,1]
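Continuing the snippet above, the class-1 probabilities in y_hat can be scored against the held-out labels. A minimal sketch of computing ROC AUC from scratch with NumPy; the roc_auc helper and the toy arrays below are illustrative stand-ins, not part of LazyQSAR:

```python
import numpy as np

def roc_auc(y_true, y_score):
    """ROC AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive is ranked above a randomly chosen negative."""
    y_true = np.asarray(y_true)
    y_score = np.asarray(y_score)
    # Rank the scores from 1..n, then average ranks over tied scores.
    order = np.argsort(y_score)
    ranks = np.empty(len(y_score), dtype=float)
    ranks[order] = np.arange(1, len(y_score) + 1)
    for s in np.unique(y_score):
        mask = y_score == s
        ranks[mask] = ranks[mask].mean()
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    u = ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2
    return u / (n_pos * n_neg)

# Stand-ins for y_test and the model's predicted probabilities
y_test = [0, 0, 1, 1]
y_hat = [0.1, 0.4, 0.35, 0.8]
print(roc_auc(y_test, y_hat))  # -> 0.75
```

If scikit-learn is available, sklearn.metrics.roc_auc_score gives the same number with less code; the hand-rolled version just makes the ranking logic explicit.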

Custom-made descriptors

Pre-calculate your descriptors using your preferred method. We recommend using the Ersilia Model Hub to that end. The .h5 format generated by Ersilia can be passed directly to the LazyQSAR pipeline. Alternatively, pass the descriptors as an in-memory array.

from lazyqsar.agnostic import LazyBinaryClassifier

model = LazyBinaryClassifier()
model.fit(X=X_train, y=y_train)
y_hat = model.predict_proba(X=X_test)[:,1]
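As a sketch of the in-memory route, descriptors stored in an HDF5 file can be read into a NumPy array with h5py before being passed to fit. The load_descriptors helper and the "Values" dataset key are assumptions for illustration, not the documented Ersilia layout; inspect your own file's keys first:

```python
import h5py
import numpy as np

def load_descriptors(h5_path, dataset_key="Values"):
    """Load a precomputed descriptor matrix from an HDF5 file.

    The default dataset key "Values" is an assumption; list the keys of
    your own file and adjust accordingly.
    """
    with h5py.File(h5_path, "r") as f:
        print("datasets in file:", list(f.keys()))
        return np.asarray(f[dataset_key])

# X_train = load_descriptors("descriptors_train.h5")
# model.fit(X=X_train, y=y_train)
```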

Using saved models at inference time

By default, models are saved as ONNX files. Once a model is trained and saved, you can load it through an artifact class; at inference time, the only hard dependency is the ONNX runtime.

To save a model, simply run:

model.save(model_dir)

This will create a folder with ONNX files in it. You can then load it with the artifact class:

from lazyqsar.artifacts import LazyBinaryClassifierArtifact

model = LazyBinaryClassifierArtifact.load(model_dir)
y_hat = model.predict_proba(X=X)[:,1]
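Before loading, it can be useful to sanity-check that the saved folder actually contains ONNX files. A small illustrative helper (not part of LazyQSAR; the directory layout beyond the .onnx files is an assumption):

```python
from pathlib import Path

def list_onnx_files(model_dir):
    """Return the names of the ONNX files in a saved model directory,
    raising early if the folder is empty or mislocated."""
    files = sorted(Path(model_dir).glob("*.onnx"))
    if not files:
        raise FileNotFoundError(f"no .onnx files found in {model_dir}")
    return [f.name for f in files]
```

This kind of check catches a wrong model_dir path before it surfaces as an opaque error inside the ONNX runtime.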

Tests and benchmarks

Quick testing

In the test/ folder you can find a quick implementation of the methods described above, for easily checking that the code works. The Bioavailability dataset and Chemeleon descriptors are used as an example.

python test/test_binary_classification.py
python test/test_binary_classification.py --agnostic

Benchmarking

In the benchmark repository you will find the performance of the default estimators and descriptors on the TDCommons ADMET datasets. This is a provisional benchmark; the team is working on a more exhaustive one.

Disclaimer

This library is only intended for quick-and-dirty QSAR modeling. For more complete automated QSAR modeling, please refer to ZairaChem.

About us

Learn about the Ersilia Open Source Initiative!
