
A Python Framework for Modeling and Analysis of Signaling Systems

Project description

BioMASS

[Badges: Actions Status · Language grade: Python · License: MIT · Downloads · PyPI version · PyPI pyversions · Code style: black]

Mathematical modeling is a powerful method for the analysis of complex biological systems. Although many studies have been devoted to building models of dynamic cellular signaling systems, most of these models are limited in scope and do not cover multiple pathways. Combining them to enable understanding at a larger scale therefore remains a challenge. At the same time, the larger the network, the more difficult it becomes to estimate the parameters needed to reproduce the dynamic experimental data required for a deeper understanding of the system.

To overcome this problem, we developed BioMASS, a Python framework for Modeling and Analysis of Signaling Systems. BioMASS efficiently optimizes many parameter sets simultaneously and generates multiple parameter candidates that explain the signaling dynamics of interest. These candidates can be further evaluated through their distributions and through sensitivity analysis, providing additional insight into the hidden regulatory mechanisms of the system.

Features

  • parameter estimation of ODE models
  • local sensitivity analysis
  • effective visualization of simulation results

Installation

The BioMASS library is available on PyPI.

$ pip install biomass

BioMASS supports Python 3.7 or newer.
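To confirm that the installation succeeded, you can query the installed version with setuptools' pkg_resources (a minimal check; any equivalent packaging tool works):

import pkg_resources

# Print the installed BioMASS version, e.g., 0.3.5.
print(pkg_resources.get_distribution("biomass").version)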

Usage

We will use the immediate-early gene response model (Nakakuki_Cell_2010) for parameter estimation, visualization of simulation results, and sensitivity analysis.

Model Preparation

from biomass.models import Nakakuki_Cell_2010

Nakakuki_Cell_2010.show_info()

Nakakuki_Cell_2010 information
------------------------------
36 species
115 parameters, of which 75 to be estimated

model = Nakakuki_Cell_2010.create()

Parameter Estimation of ODE Models (n = 1, 2, 3, · · ·)

Parameters are adjusted to minimize the distance between model simulation and experimental data.
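
In essence, this distance is an objective (cost) function such as a sum of squared residuals between simulated and measured time courses. The sketch below only illustrates that idea and is not BioMASS's internal objective; the arrays are hypothetical.

import numpy as np

def sum_of_squared_residuals(simulation, experiment):
    # Illustrative cost: squared distance between simulated and observed values.
    simulation = np.asarray(simulation, dtype=float)
    experiment = np.asarray(experiment, dtype=float)
    return float(np.sum((simulation - experiment) ** 2))

# Hypothetical time courses for a single observable.
print(sum_of_squared_residuals([0.1, 0.5, 0.9], [0.0, 0.6, 1.0]))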

from biomass import optimize

optimize(
    model=model, start=1, options={
        "popsize": 3,
        "max_generation": 100,
        "allowable_error": 0.5,
        "local_search_method": "DE",
        "maxiter": 50,
    }
)

Temporary results are saved in out/n/ after each iteration.

Progress list: out/n/optimization.log

Generation1: Best Fitness = 5.864228e+00
Generation2: Best Fitness = 5.864228e+00
Generation3: Best Fitness = 4.488934e+00
Generation4: Best Fitness = 3.793744e+00
Generation5: Best Fitness = 3.652047e+00
Generation6: Best Fitness = 3.652047e+00
Generation7: Best Fitness = 3.652047e+00
Generation8: Best Fitness = 3.452999e+00
Generation9: Best Fitness = 3.180878e+00
Generation10: Best Fitness = 1.392501e+00
Generation11: Best Fitness = 1.392501e+00
Generation12: Best Fitness = 1.392501e+00
Generation13: Best Fitness = 1.392501e+00
Generation14: Best Fitness = 7.018051e-01
Generation15: Best Fitness = 7.018051e-01
Generation16: Best Fitness = 7.018051e-01
Generation17: Best Fitness = 7.018051e-01
Generation18: Best Fitness = 7.018051e-01
Generation19: Best Fitness = 6.862063e-01
Generation20: Best Fitness = 6.862063e-01
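
Because optimization.log is plain text in the format shown above, the best fitness per generation can be extracted with a few lines of standard-library Python (a convenience sketch; the path out/1/optimization.log assumes start=1):

import re

# Collect (generation, best fitness) pairs from the progress log.
pattern = re.compile(r"Generation(\d+): Best Fitness = (\S+)")
with open("out/1/optimization.log") as f:
    history = [(int(m.group(1)), float(m.group(2)))
               for m in map(pattern.match, f) if m]

print(history[-1])  # e.g., (20, 0.6862063)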
  • If you want to continue from where you stopped in the last parameter search,
from biomass import optimize_continue

optimize_continue(
    model=model, start=1, options={
        "popsize": 3,
        "max_generation": 200,
        "allowable_error": 0.5,
        "local_search_method": "DE",
        "maxiter": 50,
    }
)
  • If you want to search multiple parameter sets (e.g., from 1 to 10) simultaneously,
from biomass import optimize

optimize(
    model=model, start=1, end=10, options={
        "popsize": 5,
        "max_generation": 100,
        "allowable_error": 0.5,
        "local_search_method": "DE",
        "maxiter": 50,
    }
)
  • Exporting optimized parameters in CSV format
from biomass.result import OptimizationResults

res = OptimizationResults(model)
res.to_csv()
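
The exported CSV can then be inspected with pandas, for example to summarize how each estimated parameter is distributed across parameter sets (the file path below is an assumption; use whatever location to_csv() writes to on your machine):

import pandas as pd

# Hypothetical output path for the table written by res.to_csv().
df = pd.read_csv("optimization_results/optimized_params.csv", index_col=0)
print(df.describe())  # spread of each parameter across parameter sets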

Visualization of Simulation Results

from biomass import run_simulation

run_simulation(model, viz_type='average', show_all=False, stdev=True)

[Figure: simulation_average]

Points (blue diamonds, EGF; red squares, HRG) denote experimental data; solid lines denote simulations.
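
Other viz_type values select different parameter sets for plotting; for example, 'best' is typically used to plot only the best-fit parameter set (an assumption here; consult the BioMASS documentation for the full list of accepted values):

from biomass import run_simulation

# Plot the simulation for the single best parameter set found so far
# ('best' is assumed to be a valid viz_type; see the documentation).
run_simulation(model, viz_type='best', show_all=False, stdev=False)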

Sensitivity Analysis

The single parameter sensitivity of each reaction is defined by

s_i(q(v), v_i) = \frac{\partial \ln q(v)}{\partial \ln v_i} = \frac{\partial q(v)}{\partial v_i} \cdot \frac{v_i}{q(v)}

where v_i is the i-th reaction rate, v is the reaction vector v = (v_1, v_2, ...), and q(v) is a target function, e.g., time-integrated response or duration. Sensitivity coefficients were calculated using finite difference approximations with 1% changes in the reaction rates.
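
As an illustration of this finite-difference approximation (not BioMASS's internal code), the coefficient for a toy target function can be computed by perturbing a single rate by 1%:

import numpy as np

def sensitivity_coefficient(q, v, i, rel_change=0.01):
    # Finite-difference approximation of d ln q / d ln v_i with a 1% perturbation.
    v_perturbed = np.array(v, dtype=float)
    v_perturbed[i] *= 1.0 + rel_change
    return (np.log(q(v_perturbed)) - np.log(q(v))) / np.log(1.0 + rel_change)

# Toy target function: an "integrated response" proportional to v_1 * v_2.
q = lambda v: v[0] * v[1]
print(sensitivity_coefficient(q, [2.0, 3.0], i=0))  # ~= 1.0 for this q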

from biomass import run_analysis

run_analysis(model, target='reaction', metric='integral', style='barplot')

[Figure: sensitivity_PcFos]

Control coefficients for integrated pc-Fos are shown by bars (blue, EGF; red, HRG). Numbers above the bars indicate the reaction indices, and error bars correspond to the simulation standard deviation.
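
run_analysis accepts other metrics and plotting styles as well; the call below is a hedged example that assumes 'duration' and 'heatmap' are among the supported values (check the documentation for the options available in your BioMASS version):

from biomass import run_analysis

# Hypothetical alternative: sensitivity of signal duration, drawn as a heatmap.
run_analysis(model, target='reaction', metric='duration', style='heatmap')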

Citation

When using BioMASS, please cite:

  • Imoto, H., Zhang, S. & Okada, M. A Computational Framework for Prediction and Analysis of Cancer Signaling Dynamics from RNA Sequencing Data—Application to the ErbB Receptor Signaling Pathway. Cancers. 12, 2878 (2020). https://doi.org/10.3390/cancers12102878

    @article{imoto2020computational,
      title={A Computational Framework for Prediction and Analysis of Cancer Signaling Dynamics from RNA Sequencing Data—Application to the ErbB Receptor Signaling Pathway},
      author={Imoto, Hiroaki and Zhang, Suxiang and Okada, Mariko},
      journal={Cancers},
      volume={12},
      number={10},
      pages={2878},
      year={2020},
      publisher={Multidisciplinary Digital Publishing Institute}
    }
    

Author

Hiroaki Imoto



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

biomass-0.3.5.tar.gz (78.0 kB)

Uploaded Source

Built Distribution

biomass-0.3.5-py3-none-any.whl (103.2 kB)

Uploaded Python 3

File details

Details for the file biomass-0.3.5.tar.gz.

File metadata

  • Download URL: biomass-0.3.5.tar.gz
  • Upload date:
  • Size: 78.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.8.8

File hashes

Hashes for biomass-0.3.5.tar.gz

  • SHA256: b3429ba138aac44235834c99dd88e4cdf41ce51e55b0484bed3044d496243615
  • MD5: e3e944cf71c6b8c0dd640db0c93a0c09
  • BLAKE2b-256: 94caca3a490d9b24b7eea68337f21d353c44db3f457275ef2cd37dae9d6fb199


File details

Details for the file biomass-0.3.5-py3-none-any.whl.

File metadata

  • Download URL: biomass-0.3.5-py3-none-any.whl
  • Upload date:
  • Size: 103.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/3.10.0 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.8.8

File hashes

Hashes for biomass-0.3.5-py3-none-any.whl

  • SHA256: ef2140e5c9d829912c00bb4c2f0bde47c855ca281ef9d57917067bbe6835ec8f
  • MD5: 2c0f6a4f47b2e8551e29444d3d6b70ab
  • BLAKE2b-256: 9e2b6f6de1f10b5f11030fcd3efc0dbcee0df86589809438bd27524038ac44eb

