
Easy parameter space evaluation and serial farming.

Project description

Psyrun

Build status: https://travis-ci.org/jgosmann/psyrun.svg?branch=master · Coverage: https://coveralls.io/repos/github/jgosmann/psyrun/badge.svg?branch=master

Psyrun is a Python tool to define parameter spaces and execute an evaluation function for each parameter assignment. In addition, Psyrun makes it easy to use serial farming, i.e. evaluating multiple parameter assignments in parallel, on multicore computers and high-performance clusters.

Documentation

The documentation can be found here.

Overview

Define parameter spaces and evaluate them:

import numpy as np

from psyrun import map_pspace, Param

def objective(a, b, c):
    return a * b + c

pspace = (Param(a=np.arange(1, 5))
          * Param(b=np.linspace(0, 1, 10))
          * Param(c=[1., 1.5, 10., 10.5]))
results = map_pspace(objective, pspace)
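
The * operator builds the Cartesian product of the individual parameter definitions, so the space above contains 4 * 10 * 4 = 160 assignments and objective is called once for each of them. As a rough plain-Python sketch of what this means (the names assignments and plain_results are illustrative and not part of the psyrun API):

import itertools

import numpy as np

def objective(a, b, c):
    return a * b + c

# Every combination of a, b, and c is evaluated exactly once
# (4 * 10 * 4 = 160 parameter assignments).
assignments = [
    dict(a=a, b=b, c=c)
    for a, b, c in itertools.product(
        np.arange(1, 5), np.linspace(0, 1, 10), [1., 1.5, 10., 10.5])
]
plain_results = [objective(**params) for params in assignments]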

Or do it in parallel:

from psyrun import map_pspace_parallel
results = map_pspace_parallel(objective, pspace)
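
This distributes the evaluations over multiple workers. A rough analogue using only the standard library (not psyrun's actual implementation; it reuses objective and assignments from the sketch above):

from concurrent.futures import ProcessPoolExecutor

def evaluate(params):
    # Module-level helper so it can be pickled and sent to worker processes.
    return objective(**params)

if __name__ == '__main__':
    # Evaluate all parameter assignments in parallel worker processes;
    # psyrun additionally takes care of collecting the results for you.
    with ProcessPoolExecutor() as executor:
        parallel_results = list(executor.map(evaluate, assignments))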

Define tasks by placing task_<name>.py files in the psy-tasks directory:

import numpy as np

from psyrun import Param

pspace = (Param(a=np.arange(1, 5))
          * Param(b=np.linspace(0, 1, 10))
          * Param(c=[1., 1.5, 10., 10.5]))

def execute(a, b, c):
    return {'result': a * b + c}

and run them by typing psy run, with support for serial farming on high-performance clusters.
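
With a task file in place, a typical session only involves the command line (output depends on your tasks and configuration):

psy run
psy status

psy run evaluates the parameter assignments of the defined tasks (or submits them as jobs on a cluster), and psy status reports how far each task has progressed.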

Installation

pip install psyrun

To be able to use the NPZ store:

pip install numpy
pip install 'psyrun[npz]'

To be able to use the HDF5 store:

pip install numpy
pip install 'psyrun[h5]'
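
Both stores keep parameters and results as named arrays on disk. NPZ is NumPy's own archive format for bundling several named arrays in a single file; as a rough illustration of that format (file name illustrative, not psyrun's internal code):

import numpy as np

# An NPZ file bundles several named arrays in one file; this is the
# format the NPZ store builds on (illustrative only).
np.savez('results.npz', a=np.arange(4), result=np.arange(4) ** 2)
data = np.load('results.npz')
print(data['a'], data['result'])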

Requirements

Optional requirements

To have faulthandler activated for jobs submitted with psy run in Python 2.7:

  • faulthandler (Python 3.4+ already includes the faulthandler module.)

To use map_pspace_parallel:

To use NPZ files as store:

To use HDF5 files as store:

To run the unit tests:

To build the documentation:

Changes

0.6.0

New features

  • Added psy new-task and psy kill commands.

  • Added AutodetectStore that determines the appropriate store from the filename extension.

  • Added possibility to let psy merge custom stores if provided as psyrun.stores entry point.

  • Added capability to set scheduler arguments based on the job name.

0.5.4

Bug fixes

  • Fix the psy run continue functionality.

0.5.3

Bug fixes

  • Fix psy status and psyrun.backend.distribute.DistributeBackend.get_missing trying to read incompatible data files in the output directory.

  • Fix psy status and psyrun.backend.distribute.DistributeBackend.get_missing easily hitting Python’s recursion depth limit.

  • Fix merging of npz files with missing integer values by converting them to float where np.nan can be used.

0.5.2

Bug fixes

  • Fix incorrect psy status.

  • Fix psy run <task1> <task2> ... to run all given tasks and to run them in the given order.

0.5.1

Bug fixes

  • Fix psy merge always assuming PickleStore.

Documentation improvements

  • Added a recipe for converting data to a Pandas data frame to the documentation.

0.5

  • Initial release



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

psyrun-0.6.0.tar.gz (44.5 kB)

Uploaded Source

Built Distribution

psyrun-0.6.0-py3-none-any.whl (37.2 kB)

Uploaded Python 3

File details

Details for the file psyrun-0.6.0.tar.gz.

File metadata

  • Download URL: psyrun-0.6.0.tar.gz
  • Upload date:
  • Size: 44.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No

File hashes

Hashes for psyrun-0.6.0.tar.gz
Algorithm Hash digest
SHA256 8d44eabaf9cd6f2136c9feaee93d686d408d3dd9e9ce0fbf82cac0f429a7739e
MD5 3d910e2427d10ddc3b3e1a8f8f98c0b9
BLAKE2b-256 82658d545eabd72cf861b604169dadf1af71ce6aba808aefd37600a19a00ec81

See more details on using hashes here.
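
For instance, the SHA256 digest of a downloaded archive can be checked with Python's hashlib (file path illustrative):

import hashlib

# Compare the local file's SHA256 digest with the value listed above.
with open('psyrun-0.6.0.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print(digest == '8d44eabaf9cd6f2136c9feaee93d686d408d3dd9e9ce0fbf82cac0f429a7739e')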

File details

Details for the file psyrun-0.6.0-py3-none-any.whl.

File metadata

File hashes

Hashes for psyrun-0.6.0-py3-none-any.whl
Algorithm Hash digest
SHA256 3c2b7800037b75ed95f7ccad68a4e147ab024595a6c7a794338d5761a5471723
MD5 b8ee36816412afc1bc439faca9f64491
BLAKE2b-256 a6ff66fb6d8ea4a54157a1558ecad07d50724f2a716dccdca0f8404741be3b27

See more details on using hashes here.
