
Cheminformatics toolkit for property calculation, filtering, and QSAR modeling

Project description

Pharmalyzer

Pharmalyzer is a Python package for data preprocessing, screening, and early-stage analysis of chemical datasets. It provides rule-based and RDKit-powered tools for screening, filtering, and assessing compounds' ADME properties, and it enables fast, reliable computation of physicochemical and pharmacokinetic properties from SMILES strings in support of cheminformatics and drug discovery workflows.

Features

  • Physicochemical Properties: MW, LogP, TPSA, H-bond donors/acceptors, etc.
  • ADME Prediction: GI absorption, BBB permeability, logKp, excretion
  • Rule-Based Filtering: Lipinski, Veber, Ghose, PAINS, Brenk filters
  • QSAR Modeling Tools: Encoding, feature selection, scaling, outlier detection
  • Similarity & Integration: Compound comparison, data merging
  • ChEMBL Integration: Fetch data directly from the ChEMBL database
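As an illustration of the rule-based filtering listed above, Lipinski's rule of five reduces to a few lines of plain Python. This is a generic sketch of the rule itself, not Pharmalyzer's `Drug_rules` API; the function name and the approximate aspirin property values are illustrative.

```python
def passes_lipinski(mw, logp, hbd, hba):
    """Return True if a compound satisfies Lipinski's rule of five.

    mw   -- molecular weight (Da), should be <= 500
    logp -- octanol-water partition coefficient, should be <= 5
    hbd  -- hydrogen-bond donors, should be <= 5
    hba  -- hydrogen-bond acceptors, should be <= 10
    """
    return mw <= 500 and logp <= 5 and hbd <= 5 and hba <= 10

# Aspirin (approx.: MW 180.16, LogP ~1.2, 1 donor, 4 acceptors) passes
print(passes_lipinski(180.16, 1.19, 1, 4))   # True
```

`Drug_rules.lipinski_filter` applies this kind of threshold check across a whole DataFrame of computed properties rather than one compound at a time.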

Installation

pip install pharmalyzer

🧪 Quickstart Example

```python
import pandas as pd

from Pharmalyzer import cleaner, Drug_rules, ADME

# Load a sample CSV file containing SMILES
df = pd.read_csv("Pharmalyzer/Pharmalyzer/sample_data.csv")

# Clean the data
df_clean = cleaner.clean_smiles(df, smiles_col="SMILES")

# Apply Lipinski rule filter
df_lipinski = Drug_rules.lipinski_filter(df_clean)

# Calculate ADME properties
df_adme = ADME.calculate_properties(df_lipinski)

print(df_adme.head())
```
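The exact behavior of `cleaner.clean_smiles` is internal to the package. Conceptually, a minimal cleaning pass over raw SMILES strings trims whitespace, drops empty entries, and removes duplicates; the hypothetical stand-in below sketches that idea on a plain list (the real function operates on a DataFrame column and also handles salt removal and standardization):

```python
def clean_smiles_list(smiles):
    """Trim whitespace, drop empty entries, and de-duplicate a list
    of SMILES strings while preserving their original order."""
    seen = set()
    cleaned = []
    for s in smiles:
        s = s.strip()
        if s and s not in seen:
            seen.add(s)
            cleaned.append(s)
    return cleaned

raw = ["CCO", " CCO ", "", "c1ccccc1", "CCO"]
print(clean_smiles_list(raw))   # ['CCO', 'c1ccccc1']
```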

🧰 Module Overview

| Module | Description |
| --- | --- |
| `cleaner.py` | Standardizes SMILES strings, removes salts, and cleans the data |
| `Drug_rules.py` | Filters compounds using rules such as Lipinski, Ghose, and PAINS |
| `ADME.py` | Computes key ADME properties and predictions |
| `toxicity.py` | Predicts potential toxicity risks |
| `qsar.py` | Builds and evaluates QSAR models |
| `encoder.py`, `scaler.py` | Preprocessing tools for ML pipelines |
| `chembl_client.py` | Fetches compound data from ChEMBL |
| `feature_selection.py` | Feature reduction and selection techniques |
| `filtering.py`, `outliers.py` | Additional data cleaning tools |
| `integrate.py`, `similarity.py` | Dataset merging and Tanimoto similarity calculations |
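`similarity.py` is described as computing Tanimoto similarity. On fingerprints represented as sets of on-bit indices, the coefficient reduces to |A ∩ B| / |A ∪ B|; the function below is a generic sketch of that formula, not the module's actual signature:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two fingerprints given as
    sets of on-bit indices: |A & B| / |A | B|."""
    if not fp_a and not fp_b:
        return 1.0  # two empty fingerprints are conventionally identical
    inter = len(fp_a & fp_b)
    return inter / (len(fp_a) + len(fp_b) - inter)

print(tanimoto({1, 2, 3}, {2, 3, 4}))   # 0.5
```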

🖼️ Before vs After Cleaning (Example Visualization)

(Replace with actual plot)

```python
import pandas as pd
import matplotlib.pyplot as plt

from Pharmalyzer import cleaner

# Before cleaning
df = pd.read_csv("sample_data.csv")
print("Before:", len(df))

# After cleaning
df_clean = cleaner.clean_smiles(df)
print("After:", len(df_clean))

# Plot the compound counts before and after cleaning
fig, ax = plt.subplots()
ax.bar(["Before", "After"], [len(df), len(df_clean)])
ax.set_ylabel("Number of compounds")
fig.savefig("cleaning_comparison.png")
```


License

MIT License

Author

Created by [Your Name]
📧 s.hassani@alum.semnan.ac.ir & sorour.hasani@gmail.com

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pharmalyzer-0.1.1.tar.gz (17.9 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

pharmalyzer-0.1.1-py3-none-any.whl (21.5 kB)

Uploaded Python 3

File details

Details for the file pharmalyzer-0.1.1.tar.gz.

File metadata

  • Download URL: pharmalyzer-0.1.1.tar.gz
  • Upload date:
  • Size: 17.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.7

File hashes

Hashes for pharmalyzer-0.1.1.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `219c85acea2d836914f669655aa85c257d1a5455c35dc6fb1cb8603e9739c58a` |
| MD5 | `b1afcebbca4f15a6809a1d53dbf71743` |
| BLAKE2b-256 | `9375e70f82bad6ee566f6dc9cde05179fc3baf09267924135e817f087309f0a1` |

See more details on using hashes here.
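To check a downloaded release file against a published digest, Python's standard `hashlib` module is enough. The sketch below hashes a throwaway temporary file for demonstration; for a real check, pass the path to `pharmalyzer-0.1.1.tar.gz` and compare the result with the SHA256 digest listed above.

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=8192):
    """Compute the SHA256 hex digest of a file, reading it in chunks
    so that large archives do not need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a small temporary file
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b"pharmalyzer")
digest = sha256_of(tmp.name)
os.unlink(tmp.name)
print(digest)
```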

File details

Details for the file pharmalyzer-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: pharmalyzer-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 21.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.7

File hashes

Hashes for pharmalyzer-0.1.1-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | `5a88769a8088b88dc35924af1d74c15e5f83a9cac5f9ee7d0ac07ed520b1dbbb` |
| MD5 | `4e23d113c784144cc43555c68f266745` |
| BLAKE2b-256 | `24b2b9f092bfc56420a3f4b6e3e9a7a83eada093c2d62d1eafeb2f4ba4f02f75` |

See more details on using hashes here.
