
Data-Centric Parallel Programming Framework

Project description


DaCe - Data-Centric Parallel Programming

Decoupling domain science from performance optimization.

DaCe is a fast parallel programming framework that takes code in Python/NumPy and other programming languages and maps it to high-performance CPU, GPU, and FPGA programs, which can be optimized to achieve state-of-the-art performance. Internally, DaCe uses the Stateful DataFlow multiGraph (SDFG) data-centric intermediate representation: a transformable, interactive representation of code based on data movement. Since the input code and the SDFG are separate, a program can be optimized without changing its source, so it stays readable. At the same time, transformations are customizable and user-extensible, so they can be written once and reused in many applications. With data-centric parallel programming, we enable direct knowledge transfer of performance optimization, regardless of the application or the target processor.

DaCe generates high-performance programs for:

  • Multi-core CPUs (tested on Intel, IBM POWER9, and ARM with SVE)
  • NVIDIA GPUs and AMD GPUs (with HIP)
  • Xilinx and Intel FPGAs

DaCe programs can be written inline in Python and transformed from the command line or Jupyter Notebooks, or SDFGs can be modified interactively using our Visual Studio Code extension.
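
As a rough illustration of that workflow, here is a minimal sketch (the scale program, the file name, and the choice of the MapTiling transformation are illustrative, not from the official samples): it obtains the SDFG of a small @dace.program, saves it for interactive editing in the VS Code extension, and applies a transformation from Python.

import dace
from dace.transformation.dataflow import MapTiling

@dace.program
def scale(x: dace.float64[100]):
    return x * 2.0

sdfg = scale.to_sdfg()                 # obtain the SDFG IR without running the program
sdfg.save('scale.sdfg')                # open this file in the Visual Studio Code extension
sdfg.apply_transformations(MapTiling)  # or transform programmatically from Python/Jupyter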

For more information, see the documentation.

Quick Start

Install DaCe with pip: pip install dace

Having issues? See our full Installation and Troubleshooting Guide.

Using DaCe in Python is as simple as adding a @dace.program decorator:

import dace
import numpy as np

@dace.program
def myprogram(a):
    for i in range(a.shape[0]):
        a[i] += i
    return np.sum(a)

Calling myprogram with any NumPy array or GPU array (e.g., from PyTorch, Numba, or CuPy) will generate data-centric code, compile it, and run it. From there, you can optimize (interactively or automatically), instrument, and distribute your code. Compilation produces a shared library (DLL/SO file) that can readily be used from any C-ABI-compatible language (C/C++, FORTRAN, etc.).
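
Continuing the snippet above, a hedged sketch of what that looks like in practice (the array size is arbitrary, and the exact SDFG calls shown here may differ between DaCe versions):

a = np.random.rand(1000)
result = myprogram(a)             # first call generates data-centric code, compiles, and runs it

sdfg = myprogram.to_sdfg(a)       # work with the IR explicitly
sdfg.apply_gpu_transformations()  # e.g., offload parallel maps to the GPU (needs CUDA/HIP)
compiled = sdfg.compile()         # builds the shared library (.so/.dll) mentioned above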

For more information on how to use DaCe, see the samples and tutorials in the repository.

Publication

The paper for the SDFG IR can be found here. Other DaCe-related publications are available on our website.

If you use DaCe, cite us:

@inproceedings{dace,
  author    = {Ben-Nun, Tal and de~Fine~Licht, Johannes and Ziogas, Alexandros Nikolaos and Schneider, Timo and Hoefler, Torsten},
  title     = {Stateful Dataflow Multigraphs: A Data-Centric Model for Performance Portability on Heterogeneous Architectures},
  year      = {2019},
  booktitle = {Proceedings of the International Conference for High Performance Computing, Networking, Storage and Analysis},
  series    = {SC '19}
}

Contributing

DaCe is an open-source project. We are happy to accept Pull Requests with your contributions! Please follow the contribution guidelines before submitting a pull request.

License

DaCe is published under the New BSD license, see LICENSE.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dace-1.0.0.tar.gz (5.8 MB)

Uploaded Source

File details

Details for the file dace-1.0.0.tar.gz.

File metadata

  • Download URL: dace-1.0.0.tar.gz
  • Upload date:
  • Size: 5.8 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.7.6

File hashes

Hashes for dace-1.0.0.tar.gz

  • SHA256: 7accc68139898ab788d8126c2051ef1559ab56aa4d19439b89632d843cbe2ed7
  • MD5: 2e91a850cf1719c7c5b974974c58a01c
  • BLAKE2b-256: f350ded7cbbe9eccba2b5a27444bf8951117c80d803c4741def6380fb6f063a1

See more details on using hashes here.
