
Visualizing Dynamic Programming on Tree Decompositions.

Project description

TdVisu



Visualization for dynamic programming on tree decompositions.

A graph object is created for each given graph that is of interest for the dynamic programming.

The visualization generates highlights and adds solution tables for user-defined time steps.

These snapshots of the graphs are written in a Graphviz-supported file format to a folder of your choosing.

For the portable and lightweight '.svg' format, all graphs for a timestep can be joined together to provide a thorough view of the dynamic-programming process.

With the '.svg' format the images are highly customizable, and even combining several timesteps using SVG animation could be an option in the future.


Using

Note: see also the steps prepared in the CI/CD .github/workflows/python-app.yml:

Graphviz (>=2.38). Be aware of changes in default layouts over different major versions of Graphviz. The project currently tests with graphviz-version: "12.2.1".

python-benedict[xml]

PostgreSQL adapter for Python: psycopg (3)
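
The two Python packages can, for example, be installed with pip (Graphviz itself is installed separately, e.g. via the system package manager or the official installers; the psycopg[binary] extra here is an assumption chosen for convenience, not taken from the workflow file):

pip install "python-benedict[xml]" "psycopg[binary]"

Installing tdvisu itself with pip should pull in its Python dependencies automatically.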


To register the graphviz plugins

https://gitlab.com/graphviz/graphviz/-/issues/1352

dot -c (on Windows: dot.exe -c; may require administrator/root rights)

To install

In a command prompt with pip installed (to get pip, see https://pip.pypa.io/en/stable/), just run

pip install -h (for more information on install options)
pip install tdvisu

To download the latest version from the default branch:

git clone --depth 1 --single-branch https://github.com/VaeterchenFrost/tdvisu

To isolate the dependencies

With virtualenv installed on the system, you can isolate the environment, for example:

virtualenv tdvisu_dir -p 3.12
cd tdvisu_dir/bin/
source activate
# Windows: ./tdvisu_dir/Scripts/activate

With Conda installed on the system, the dependencies for this project can be installed automatically into a new environment:

Go to the project's base directory.

Open a conda command prompt with admin privileges and run the following commands from the project folder

  • to create a new environment with basic dependencies:
conda env create -f ./environment.yml
  • to activate the environment:
conda activate tdvisu

Install from source

To clone the complete repository:

git clone https://github.com/VaeterchenFrost/tdvisu

To download only the latest version from the default branch:

git clone --depth 1 --single-branch https://github.com/VaeterchenFrost/tdvisu

To install the project from the source folder:

pip install -h (for more information on install options)
pip install .

To confirm that the visualization finds all dependencies:

python ./tdvisu/visualization.py -h

To run all tests:

pip install .[test]
pytest ./test/

How to use

The visualization needs input in the form of the JSON API. The creation of this file is implemented, as an example, in construct_dpdb_visu.py, or in the fork GPUSAT with --visufile filename (optionally disabling preprocessing with -p).

Run visualization.py with the above dependencies installed. It takes two parameters: the JSON infile to read from and, optionally, an output folder. With both arguments a run might look like this:

python tdvisu/visualization.py visugpusat.json examplefolder

For #SAT it produces, for example, three different graphs, each suffixed with a running integer to represent the timesteps:

  • TDStep the tree decomposition with solved nodes
  • PrimalGraphStep the primal graph with currently active variables highlighted
  • IncidenceGraphStep the bipartite incidence graph with active clauses/variables highlighted

The graphs are images encoded in resolution-independent .svg files (see https://www.lifewire.com/svg-file-4120603).
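
As a small, purely illustrative sketch (not part of the package), the snapshots produced by the run above could be collected per graph family with a few lines of Python; the exact file names are an assumption based on the prefixes and the running timestep integer described here:

from pathlib import Path

# output folder passed as the second argument to visualization.py above
outfolder = Path("examplefolder")

# the three graph families of a #SAT run, each suffixed with the timestep number
for prefix in ("TDStep", "PrimalGraphStep", "IncidenceGraphStep"):
    snapshots = sorted(outfolder.glob(prefix + "*.svg"))
    print(prefix, "->", [p.name for p in snapshots])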

How to use construct_dpdb_visu.py

After installing the project dp_on_dbs with the requirements listed there, we need to

  • edit database.ini with our PostgreSQL password
  • Solve a problem with python dpdb.py [GENERAL-OPTIONS] -f <INPUT-FILE> <PROBLEM> [PROBLEM-SPECIFIC-OPTIONS]
    • for the problem VertexCover
      • with flag --gr-file to store the htd Input (if the input was in a different format)
    • for the problem SharpSat
      • with flag --store-formula to store the formula in the database
  • Run
    • Sat / SharpSat: python construct_dpdb_visu.py [PROBLEMNUMBER]
    • VertexCover: python construct_dpdb_visu.py [PROBLEMNUMBER] --twfile [TWFILE] with the file in DIMACS tw-format containing the edges of the graph.

Installation of the psycopg package

See https://www.psycopg.org/psycopg3/docs/basic/install.html

Note: Whatever version of libpq psycopg is compiled with, it will be possible to connect to PostgreSQL servers of any supported version: just install the most recent libpq version or the most practical, without trying to match it to the version of the PostgreSQL server you will have to connect to.
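
As a rough sketch only (the real connection handling lives in construct_dpdb_visu.py), reading the credentials from the database.ini mentioned above and opening a psycopg 3 connection might look as follows; the section and key names in the .ini file are assumptions:

from configparser import ConfigParser

import psycopg

# read the connection parameters; the section name "postgresql" and the
# key names (host, port, dbname, user, password) are assumptions
config = ConfigParser()
config.read("database.ini")
params = dict(config["postgresql"])

# psycopg 3 accepts the connection parameters as keyword arguments
with psycopg.connect(**params) as conn:
    with conn.cursor() as cur:
        cur.execute("SELECT version()")
        print(cur.fetchone())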


New Release

Version

  • Bump /version.py according to the changes made
  • Change date to the release day, keep format

Requirements

In case dependencies have changed, or just to update some, check

  • requirements.txt
  • stable-requirements.txt (using pip freeze)
  • setup.py
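
The pinned versions in stable-requirements.txt can be refreshed from the active environment, for example:

pip freeze > stable-requirements.txt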

Write CHANGELOG.md

  • Add tag with link (see bottom for linking examples)
  • Add changes, maybe some are already in Unreleased
  • Update Unreleased with (No) unreleased changes

Review code

  • Run tests (pytest)
  • Check codestyle (pylint)

Push

  • Push changes to main
  • Wait for all automated checks! (All checks have passed...)

Create Release

  • On the GitHub page go to: Release, Draft a new release
  • Enter v'YOUR VERSION NUMBER' as the tag.
  • Add a Release Title (could be just the version)
  • Add some description (like in the CHANGELOG.md)
  • Click on Publish release on the bottom

This should automatically release to PyPI.

Now you are set for the new release :tada:


Download files

Download the file for your platform.

Source Distribution

tdvisu-1.2.0.tar.gz (72.5 kB)

Uploaded Source

Built Distribution

tdvisu-1.2.0-py3-none-any.whl (59.5 kB)

Uploaded Python 3

File details

Details for the file tdvisu-1.2.0.tar.gz.

File metadata

  • Download URL: tdvisu-1.2.0.tar.gz
  • Size: 72.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.8

File hashes

Hashes for tdvisu-1.2.0.tar.gz:
  • SHA256: e92ef56ea1889012b3f739180a53bb6ebce90162bca8e2e05e0063efa81060b8
  • MD5: 8421eedfcf89d2e65aa0bb4f1ac6818e
  • BLAKE2b-256: 7ebece36e6a7644c35925aa1d8b1e3e68642557172bfed14847fa78a8b615bcb


File details

Details for the file tdvisu-1.2.0-py3-none-any.whl.

File metadata

  • Download URL: tdvisu-1.2.0-py3-none-any.whl
  • Size: 59.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.0.1 CPython/3.12.8

File hashes

Hashes for tdvisu-1.2.0-py3-none-any.whl:
  • SHA256: a5486cefa52a43abd6b61697347945eb83f3db000ea451436d21cae42f2c4f7b
  • MD5: 859b930f3612ff40ebbccdd37bbc3f32
  • BLAKE2b-256: 9793d04156d03eb35dce354cc5c850598997777b47b63727396704a272b20723

