Flexible normalization methods for proteomics quantitative data

Project description

Pronoms: Proteomics Normalization Python Library

Overview

Pronoms is a Python library implementing multiple normalization methods for quantitative proteomics data. Each normalization method is encapsulated within modular, reusable classes. The library includes visualization capabilities that allow users to easily observe the effects of normalization. Some normalization methods, such as VSN normalization, leverage R on the backend for computation.

Installation

Prerequisites

  • Python 3.9 or higher
  • For R-based normalizers (VSN):
    • R installed on your system
    • Required R packages: vsn

Installing from PyPI

pip install pronoms

Installing for Development

# Clone the repository
git clone https://github.com/yourusername/pronoms.git
cd pronoms

# Install in development mode with dev dependencies
pip install -e ".[dev]"

Usage

Basic Example

import numpy as np
from pronoms.normalizers import MedianNormalizer

# Create sample data
data = np.random.rand(5, 100)  # 5 samples, 100 proteins/features

# Create normalizer and apply normalization
normalizer = MedianNormalizer()
normalized_data = normalizer.normalize(data)

# Visualize the effect of normalization
normalizer.plot_comparison(data, normalized_data)

Available Normalizers

  • DirectLFQNormalizer: Performs protein quantification directly from peptide/ion intensity data using the DirectLFQ algorithm. Ammar C, Schessner JP, Willems S, Michaelis AC, Mann M. Accurate Label-Free Quantification by directLFQ to Compare Unlimited Numbers of Proteomes. Mol Cell Proteomics. 2023 Jul;22(7):100581. doi:10.1016/j.mcpro.2023.100581. PMID: 37225017
  • L1Normalizer: Scales samples to have a unit L1 norm (sum of absolute values).
  • MADNormalizer: Median Absolute Deviation Normalization. Robustly scales samples by subtracting the median and dividing by the Median Absolute Deviation (MAD).
  • MedianNormalizer: Scales each sample (row) by its median, then rescales by the mean of medians to preserve overall scale.
  • MedianPolishNormalizer: Tukey's Median Polish. Decomposes data (often log-transformed) into overall, row, column, and residual effects by iterative median removal.
  • QuantileNormalizer: Normalizes samples to have the same distribution using quantile mapping.
  • SPLMNormalizer: Stable Protein Log-Mean Normalization. Uses stably expressed proteins (low log-space CV) to derive scaling factors for normalization in log-space, then transforms back.
  • VSNNormalizer: Variance Stabilizing Normalization (via R's vsn package). Stabilizes variance across the intensity range. Huber W, von Heydebreck A, Sültmann H, Poustka A, Vingron M. Variance stabilization applied to microarray data calibration and to the quantification of differential expression. Bioinformatics. 2002;18 Suppl 1:S96–104. doi:10.1093/bioinformatics/18.suppl_1.s96. PMID: 12169536
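To make the scaling idea concrete, here is a minimal NumPy sketch of median normalization as described for MedianNormalizer above. This is an illustrative re-implementation under the stated description, not Pronoms' actual code:

```python
import numpy as np

def median_normalize(data):
    """Scale each sample (row) by its median, then rescale by the mean
    of the medians so the overall intensity scale is preserved.
    Illustrative sketch only -- not the Pronoms implementation."""
    medians = np.median(data, axis=1, keepdims=True)  # per-sample medians
    scaled = data / medians                           # each row now has median 1
    return scaled * medians.mean()                    # restore overall scale

rng = np.random.default_rng(0)
data = rng.random((5, 100))           # 5 samples, 100 proteins/features
normalized = median_normalize(data)
# Every sample now shares the same median: the mean of the original medians.
```

After this transformation, between-sample differences in overall intensity are removed while the global scale of the data is kept.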

Data Format

All normalizers expect data in the format of a 2D numpy array or pandas DataFrame with shape (n_samples, n_features) where:

  • Each row represents a sample
  • Each column represents a protein/feature

This follows the standard convention used in scikit-learn and other Python data science libraries.
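Proteomics search engines often export matrices with proteins as rows and samples as columns; in that case, transpose before normalizing. A small sketch (the shapes here are hypothetical):

```python
import numpy as np

# Hypothetical export: 100 proteins (rows) x 5 samples (columns)
exported = np.random.rand(100, 5)

# Pronoms expects (n_samples, n_features), so transpose first
data = exported.T
print(data.shape)  # (5, 100)
```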

R Integration

For normalizers that use R (VSN), ensure R is properly installed and accessible. The library uses rpy2 to interface with R.
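Before relying on VSNNormalizer, it can help to verify that both rpy2 and an R executable are actually available. This is a convenience sketch using only the standard library, not part of the Pronoms API:

```python
import importlib.util
import shutil

def r_backend_available():
    """Rough availability check for the R-backed normalizers:
    rpy2 must be importable and an R executable must be on PATH.
    Convenience sketch only -- not part of the Pronoms API."""
    has_rpy2 = importlib.util.find_spec("rpy2") is not None
    has_r = shutil.which("R") is not None
    return has_rpy2 and has_r

if not r_backend_available():
    print("VSNNormalizer unavailable: install R and run `pip install rpy2`")
```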

Installing Required R Packages

The VSN package is part of Bioconductor. In R, run the following commands:

if (!require("BiocManager", quietly = TRUE))
    install.packages("BiocManager")

BiocManager::install("vsn")

Development

  • Run tests: pytest

License

This project is licensed under the Apache License - see the LICENSE file for details.

