
Ersilia's LazyQSAR

A library to quickly build supervised machine-learning models for chemistry.

Installation

Install LazyQSAR from source:

git clone https://github.com/ersilia-os/lazy-qsar.git
cd lazy-qsar
python -m pip install -e .

To use the default LazyQSAR descriptors, install the optional descriptor dependencies:

python -m pip install -e ".[descriptors]"

Binary Classification

LazyQSAR's binary classifier can run either with default descriptors or with custom descriptors passed by the user.

Built-in descriptors

Instantiate the LazyBinaryQSAR class with the mode of your choice (fast, default, or slow):

from lazyqsar.qsar import LazyBinaryQSAR

model = LazyBinaryQSAR(mode="fast")
model.fit(smiles_list=smiles_train, y=y_train)
y_hat = model.predict_proba(smiles_list=smiles_test)[:,1]
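The second column of predict_proba is the positive-class probability, so y_hat can be scored directly against held-out labels with AUROC. Below is a minimal pure-Python sketch of that metric, just to show how the output is typically evaluated (any standard implementation, such as scikit-learn's roc_auc_score, works equally well; the toy labels and scores are illustrative, not from the library):

```python
def auroc(y_true, y_score):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a random positive is scored above a random
    negative, with ties counted as one half."""
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Example with toy labels and scores (replace with y_test and y_hat):
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```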

Custom-made descriptors

Pre-calculate your descriptors using your preferred method; we recommend the Ersilia Model Hub for this. The .h5 files generated by Ersilia can be passed directly to the LazyQSAR pipeline. Alternatively, just pass the descriptors as an in-memory array.

from lazyqsar.agnostic import LazyBinaryClassifier

model = LazyBinaryClassifier()
model.fit(X=X_train, y=y_train)
y_hat = model.predict_proba(X=X_test)[:,1]
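The agnostic classifier only expects X to be a numeric matrix with one row per molecule, so any featurization can feed it. The sketch below is a deliberately toy featurizer (character counts over a few SMILES tokens, not a real chemical descriptor) shown only to illustrate the expected shape of X:

```python
def toy_descriptors(smiles_list):
    # Count a handful of SMILES tokens per molecule. Real pipelines would
    # use RDKit fingerprints or Ersilia Model Hub descriptors instead.
    tokens = ["C", "N", "O", "=", "(", "c", "n"]
    return [[smi.count(tok) for tok in tokens] for smi in smiles_list]

X_train = toy_descriptors(["CCO", "c1ccccc1", "CC(=O)N"])
print(X_train)  # one 7-dimensional row per input SMILES
```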

Using saved models at inference time

By default, models are saved as ONNX files. Once a model is trained and saved, you can load it through an artifact class; at inference time, the only essential dependency is the ONNX runtime.

To save a model, simply run:

model.save(model_dir)

This creates a folder containing the ONNX files, which you can then load with the artifact class.

from lazyqsar.artifacts import LazyBinaryClassifierArtifact

model = LazyBinaryClassifierArtifact.load(model_dir)
y_hat = model.predict_proba(X=X)[:,1]
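predict_proba returns probabilities rather than class labels; turning the positive-class column into hard 0/1 predictions is a simple threshold. A sketch of that step (0.5 is shown, but the cut-off is a modeling choice driven by the relative cost of false positives and false negatives):

```python
def to_labels(probs, threshold=0.5):
    # Binarize positive-class probabilities into 0/1 predictions.
    return [int(p >= threshold) for p in probs]

print(to_labels([0.1, 0.5, 0.93]))  # [0, 1, 1]
```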

Tests and benchmarks

Quick testing

The /tests folder contains a quick implementation of the methods described above, for easily checking that the code works. The Bioavailability dataset and Chemeleon descriptors are used as an example.

python test/test_binary_classification.py
python test/test_binary_classification.py --agnostic

Benchmarking

In the benchmark repository you will find the performance of the default estimators and descriptors on the TDCommons ADMET dataset. This is a provisional benchmark; the team is working on a more exhaustive one.

Disclaimer

This library is only intended for quick-and-dirty QSAR modeling. For more complete automated QSAR modeling, please refer to Zaira Chem.

About us

Learn about the Ersilia Open Source Initiative!
