
PolyPhy

PolyPhy is an unconventional toolkit for reconstructing continuous networks out of sparse 2D or 3D data. Such data can be defined as collections of discrete points, or a continuous sparse scalar field. PolyPhy produces a scalar density field that defines the recovered network structure. With the help of GPU-accelerated simulation and visualization, PolyPhy provides domain experts an interactive way to reconstruct discrete geometric data with an underlying network structure. The reconstruction is driven by the Monte Carlo Physarum Machine algorithm, a metaheuristic inspired by the morphology and dynamics of Physarum polycephalum aka 'slime mold'.
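As a rough intuition for how an MCPM-style process operates (an illustrative sketch only, not PolyPhy's actual implementation): agents carrying a position and heading repeatedly sense a deposit field at a few probe directions ahead of them, make a probabilistic steering decision weighted by the sensed values (the "Monte Carlo" aspect), move a step, and accumulate a trace field that encodes the emergent network. A minimal NumPy version of that loop might look like:

```python
# Illustrative 2D Physarum-style agent loop (NOT PolyPhy's code):
# agents sense a "deposit" field ahead, steer probabilistically toward
# stronger deposit, move, and write into a "trace" field.
import numpy as np

rng = np.random.default_rng(0)
N, GRID = 500, 128
pos = rng.uniform(0, GRID, size=(N, 2))    # agent positions
ang = rng.uniform(0, 2 * np.pi, size=N)    # agent headings
deposit = np.zeros((GRID, GRID))
trace = np.zeros((GRID, GRID))

SENSE_DIST, SENSE_ANGLE, STEP = 4.0, 0.5, 1.0

def sample(field, p):
    """Nearest-cell lookup with toroidal wrap-around."""
    ij = np.floor(p).astype(int) % GRID
    return field[ij[:, 0], ij[:, 1]]

for _ in range(100):
    # sense deposit at three probes: left, center, right
    offsets = np.stack([ang - SENSE_ANGLE, ang, ang + SENSE_ANGLE])
    probes = pos[None] + SENSE_DIST * np.stack(
        [np.cos(offsets), np.sin(offsets)], axis=-1)
    sensed = np.stack([sample(deposit, probes[k]) for k in range(3)])
    # Monte Carlo directional decision: pick a probe with probability
    # proportional to (deposit + eps), rather than a greedy argmax
    w = sensed + 1e-4
    cdf = np.cumsum(w / w.sum(axis=0), axis=0)
    u = rng.uniform(size=N)
    choice = (u > cdf[0]).astype(int) + (u > cdf[1]).astype(int)
    ang = offsets[choice, np.arange(N)]
    # move, then deposit into both fields
    pos = (pos + STEP * np.stack([np.cos(ang), np.sin(ang)], -1)) % GRID
    ij = np.floor(pos).astype(int)
    np.add.at(deposit, (ij[:, 0], ij[:, 1]), 1.0)
    np.add.at(trace, (ij[:, 0], ij[:, 1]), 1.0)
    deposit *= 0.9                          # exponential decay of deposit
```

With deposit decay, self-reinforcing trails emerge along paths that agents travel repeatedly; PolyPhy's real pipelines add data-driven deposit, 3D support, and GPU acceleration via Taichi on top of this basic idea.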


System Requirements

  • A reasonably powerful GPU; a mid-range discrete GPU is recommended
    • currently runs best on NVIDIA GPUs; other vendors are supported as well (subject to the current capabilities of the Taichi API)
    • a CPU fallback is available for debugging purposes
  • Recent Windows, Linux, or macOS
  • Python 3.x; Anaconda recommended

Repository

The main repository is located at the following GitHub URL:
https://github.com/PolyPhyHub/PolyPhy.git

The other repositories are linked from the organization page:
https://github.com/PolyPhyHub/

Running PolyPhy

Please note that the project is currently undergoing significant refactoring to streamline use of the software from the CLI, improve its modularity (making it easier to implement custom pipelines and extend existing ones), and add new features (such as the recent addition of batch mode).

To install PolyPhy, clone this repository, open a terminal, navigate to the root of the repo, and run

pip install -r requirements.txt

Afterwards, navigate to the ./experiments/jupyter/production/ directory and run

python polyphy_2DDiscrete.py -f "data/csv/sample_2D_linW.csv"

for the standard 2D pipeline, or

python polyphy_3DDiscrete.py -f "data/csv/sample_3D_linW.csv"

to invoke the standard 3D discrete pipeline on sample data. You can also specify a custom CSV file (see the sample data for the format details; typically the data are tuples with 2 or 3 spatial coordinates followed by a weight for each data point). The functionality of these pipelines is described below.
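If you want to try the pipelines on your own data, a short script can generate a CSV in the shape described above. The exact column conventions (ordering, header row) should be checked against the bundled sample files; the layout below (x, y, z, weight per row, no header) and the file name are assumptions for illustration:

```python
# Generate a small synthetic point cloud in the assumed CSV layout:
# one point per row -- x, y, z coordinates followed by a weight.
# Check the bundled sample files for the exact column conventions.
import csv
import random

random.seed(42)
with open("my_points_3D.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for _ in range(100):
        x, y, z = (random.uniform(0.0, 100.0) for _ in range(3))
        weight = random.uniform(0.5, 2.0)
        writer.writerow([f"{x:.3f}", f"{y:.3f}", f"{z:.3f}", f"{weight:.3f}"])
```

The resulting file can then be passed to the pipeline via the -f flag shown above.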

To display help on the available CLI parameters, simply run the respective command without any arguments.

There are also a number of notebooks implementing various pipelines (some of which are documented below). These are updated to varying degrees, and we are in the process of porting them to the refactored class structure. Updates are coming soon.

Functionality

The use-cases currently supported by PolyPhy are divided according to the data workflow they are built around. Each use-case has a corresponding Jupyter notebook that implements it located in ./experiments/Jupyter. This section reviews them case by case, and the following section provides an extensive tutorial recorded at the recent OSPO Symposium 2022.

  • 2D self-patterning is the most basic use-case implemented within the ./experiments/Jupyter/PolyPhy_2D_discrete_data notebook. The ability of MCPM to generate a diversity of patterns with network characteristics is achieved by disabling the data marker deposition, leaving only the MCPM agents to generate the marker responsible for maintaining structure.

    2D_self-patterning

  • 2D procedural pipeline provides an easy environment to experiment with the behavior of PolyPhy in the presence of discrete data with different spatial frequencies. Editing (adding new data points) is also supported. This pipeline is implemented in the ./experiments/Jupyter/PolyPhy_2D_discrete_data notebook.

    2D_discrete_procedural

  • 2D discrete pipeline implements the canonical way of working with custom data defined by a CSV file. The example below demonstrates fitting to a 2D projection of the SDSS galaxy dataset. This pipeline is implemented in the ./experiments/Jupyter/PolyPhy_2D_discrete_data notebook.

    2D_discrete_explicit

  • 2D continuous pipeline demonstrates the workflow with a continuous user-provided dataset. Instead of a discrete set of points as in the previous use-cases, the data is defined by a scalar field, which in 2D amounts to a grayscale image. The example below approximates the US road network using only a sparse population density map as the input. This pipeline is implemented in the ./experiments/Jupyter/PolyPhy_2D_continuous_data notebook.

    2D_continuous

  • 3D discrete pipeline provides functionality equivalent to the original Polyphorm implementation. The dataset consists of SDSS galaxies defined as a weighted collection of 3D points. The visualization is based on volumetric ray marching, simultaneously fetching the deposit and the trace fields. This pipeline is implemented in the ./experiments/Jupyter/PolyPhy_3D_discrete_data notebook.

    3D_discrete_explicit
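To make the continuous-pipeline input concrete: the scalar field is simply a 2D array of density values, so any grayscale image can serve as one after normalization. A hypothetical sketch, using a synthetic array in place of an image loaded from disk (which you might do with Pillow or a similar library):

```python
# A grayscale image interpreted as a continuous 2D scalar field,
# as used by the 2D continuous pipeline. A synthetic uint8 array
# stands in here for an image loaded from disk.
import numpy as np

h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
# synthetic "population density": two Gaussian blobs
img = (200 * np.exp(-((xx - 16)**2 + (yy - 20)**2) / 50.0)
       + 150 * np.exp(-((xx - 48)**2 + (yy - 40)**2) / 80.0)).astype(np.uint8)

field = img.astype(np.float64) / 255.0   # map 0..255 to [0, 1]
field /= field.sum()                     # normalize to a probability density
```

Sparse inputs like the population map in the example above work well precisely because the agents interpolate structure between the high-density regions.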

How to Use PolyPhy

Below is a recording of the PolyPhy Workshop given as part of the OSPO Symposium 2022.
This 93-minute workshop covers PolyPhy's research background, all five of the above use-cases, and an extended technical discussion.

Services

Tox

Tox is a virtual environment management and test tool that lets you define and run custom tasks that call executables from Python packages. Tox downloads the dependencies you have specified, builds the package, installs it in a virtual environment, and runs the tests using pytest. Make sure to install tox and run it from the root of the project if you intend to work on development.

tox # download dependencies, build and install package, run tests
tox -e docs  # to build your documentation
tox -e build  # to build your package distribution
tox -e publish  # to test your project uploads correctly in test.pypi.org
tox -e publish --repository pypi  # to release your package to PyPI
tox -av  # to list all the tasks available
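For orientation, a minimal tox.ini supporting the commands above might look like the following. This is an illustrative sketch only; consult the actual tox.ini in the repository for the real environment definitions:

```ini
# Minimal illustrative tox.ini (the project's actual file may differ)
[tox]
envlist = py3

[testenv]
deps = pytest
commands = pytest {posargs}

[testenv:docs]
deps = sphinx
changedir = docs
commands = sphinx-build -b html . _build/html

[testenv:build]
deps = build
skip_install = true
commands = python -m build
```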

GitHub Actions

GitHub Actions is used to test on macOS as well as Linux. It automates the building, testing, and deployment pipeline.

Codecov

A service that generates a visual report of how much of the code is covered by tests. All configuration settings can be found in the codecov.yml file.

AppVeyor

A service used to run tests on Windows. All configuration settings can be found in the appveyor.yml file.
