
This package is the Python implementation of the CLINE method

Project description

pyCLINE - Python package for CLINE

pyCLINE is the Python implementation of the CLINE method (Computational Learning and Identification of NullclinEs). It can be installed from PyPI with pip:

pip install pyCLINE

The package allows you to recreate all data, models and results shown in Prokop, Billen, Frolov and Gelens (2025), and to apply CLINE to other data sets. To generate the data used in Prokop, Billen, Frolov and Gelens (2025), a set of models is provided under pyCLINE.model. Data from these models can be generated using pyCLINE.generate_data(). Data preparation and subsequent neural network training are handled by the submodule pyCLINE.recovery_methods, which contains the module pyCLINE.recovery_methods.data_preparation for data preparation and pyCLINE.recovery_methods.nn_training for neural network training.

For a better understanding, pyCLINE also contains the module pyCLINE.example, which provides the four examples from Prokop, Billen, Frolov and Gelens (2025) with step-by-step instructions on how to set up a CLINE pipeline.

Documentation of the package is hosted on GitLab Pages.

An overview of the package structure is given in the documentation.

Example of use

As mentioned above, several examples can be run by calling the module example:

import pyCLINE

pyCLINE.example(example_name)

where the example names are FHN (FitzHugh-Nagumo model), Bicubic (bicubic model), GeneExpression (gene expression model) or DelayOscillator (delay oscillator model), all of which are introduced in the manuscript.

Additionally, below you can find the FitzHugh-Nagumo (FHN) example without the use of pyCLINE.example. After installing the package, we import pyCLINE as well as torch.nn, so that we can configure the activation function for neural network training. We also load pandas to read the generated test data sets back in.

import pyCLINE
import torch.nn as nn
import pandas as pd

First, we generate the FHN data set, which we can then load back as a pandas DataFrame from the created data directory:

pyCLINE.generate_data.FHN(dt=0.1, N=1000000, epsilons=[0.3], n_initial_conditions=1)
df = pd.read_csv('data/synthetic/FHN_eps=0.3_a=0.0')

This prepared DataFrame contains many simulations from randomly selected initial conditions; here we select a single one and reset the index of the DataFrame.

df_sim = df[(df['sim']==1)].copy()
df_sim.reset_index(drop=True, inplace=True)

This step can be skipped if your data consists of a single time series. Next, we prepare the data for training, declaring the column names together with a set of parameters used to normalize the data:

df_sim, df_coef = pyCLINE.recovery_methods.data_preparation.prepare_data(df_sim, vars=['u', 'v'], time='time', tmin=10, scheme='derivative', value_min=0.0, value_max=1.0)
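The value_min/value_max arguments set the target range of a min-max normalization. As a minimal sketch of that rescaling (an assumption about what prepare_data does internally, not the package's actual code):

```python
import pandas as pd

def minmax_normalize(s: pd.Series, value_min: float = 0.0, value_max: float = 1.0) -> pd.Series:
    # linearly rescale a column so its observed range maps onto [value_min, value_max]
    lo, hi = s.min(), s.max()
    return value_min + (s - lo) * (value_max - value_min) / (hi - lo)

u = pd.Series([1.0, 2.0, 3.0])
print(minmax_normalize(u).tolist())  # [0.0, 0.5, 1.0]
```

Normalizing all variables to a common range keeps the network inputs and targets on comparable scales during training.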

We then define the variables that will be used as input and output/target variables of the neural network, and split the data set into training, test and validation sets:

# input_vars and target_vars are lists of column names from the prepared df_sim,
# e.g. one (normalized) variable and its derivative as inputs and the other variable as target
input_train, target_train, input_test, target_test, input_val, target_val = pyCLINE.recovery_methods.data_preparation.shuffle_and_split(df_sim, input_vars=input_vars, target_var=target_vars, optimal_thresholding=False)

With the prepared data, we can set up the model and train it:

# set up the model
nn_model, optimizer, loss_fn = pyCLINE.recovery_methods.nn_training.configure_FFNN_model(Nin=len(input_vars), Nout=len(target_vars), Nlayers=3, Nnodes=64, summary=True, lr=1e-4, activation=nn.SiLU)

# training (val_min and val_max are the normalization bounds used in prepare_data, here 0.0 and 1.0)
training_loss, val_loss, test_loss, predictions_evolution, lc_predictions, _ = pyCLINE.recovery_methods.nn_training.train_FFNN_model(model=nn_model, optimizer=optimizer, loss_fn=loss_fn, input_train=input_train, target_train=target_train, input_test=input_test, target_test=target_test, validation_data=(input_val, target_val), epochs=3000, batch_size=64, device='cpu', save_evolution=True, method='derivative', minimal_value=val_min, maximal_value=val_max)
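For reference, here is a plain PyTorch sketch of the kind of feed-forward network the configure_FFNN_model call above describes (3 hidden layers of 64 SiLU nodes); the exact layer layout is an assumption, not the package's implementation:

```python
import torch
import torch.nn as nn

def make_ffnn(n_in: int, n_out: int, n_layers: int = 3, n_nodes: int = 64,
              activation=nn.SiLU) -> nn.Sequential:
    # stack of n_layers Linear+activation blocks, followed by a linear output layer
    layers = [nn.Linear(n_in, n_nodes), activation()]
    for _ in range(n_layers - 1):
        layers += [nn.Linear(n_nodes, n_nodes), activation()]
    layers.append(nn.Linear(n_nodes, n_out))
    return nn.Sequential(*layers)

model = make_ffnn(n_in=2, n_out=1)
x = torch.randn(5, 2)
print(model(x).shape)  # torch.Size([5, 1])
```

With the derivative scheme, the inputs would be one variable and its time derivative, and the output the other variable, so Nin=2 and Nout=1 in this example.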

The results of training are the losses as well as the predictions of the limit cycle (lc_predictions) and of the nullclines (predictions_evolution) over the set number of epochs, which can be used to visualize the outcome of the nullcline predictions.
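As an illustration, the final-epoch predictions could be plotted as follows; the array shapes and stand-in data here are assumptions replacing the real outputs of train_FFNN_model:

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend, no display needed
import matplotlib.pyplot as plt

# stand-in arrays with an assumed (epochs x points) layout
rng = np.random.default_rng(0)
predictions_evolution = rng.random((50, 200))  # nullcline predictions per saved epoch
lc_predictions = rng.random((50, 200))         # limit-cycle predictions per saved epoch
u = np.linspace(0.0, 1.0, 200)                 # normalized u grid

fig, ax = plt.subplots()
ax.plot(u, predictions_evolution[-1], label='predicted nullcline (final epoch)')
ax.plot(u, lc_predictions[-1], '.', markersize=2, label='predicted limit cycle (final epoch)')
ax.set_xlabel('u (normalized)')
ax.set_ylabel('v (normalized)')
ax.legend()
fig.savefig('fhn_nullcline_prediction.png')
```

Plotting only the last row shows the converged prediction; plotting several rows instead would show how the nullcline estimate evolves over training.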

Contributing to pyCLINE

If you want to contribute to the pyCLINE package, you can do so here on GitLab by creating an issue, either for bug reports or for suggestions.

If you have already written code that fixes bugs or adds extensions, feel free to create a merge request; however, before putting substantial effort into major code changes, please open an issue to discuss them first.
