Arkouda (αρκούδα): NumPy-like arrays at massive scale backed by Chapel.

NOTE: Arkouda is under the MIT license.

Online Documentation

Arkouda Online Documentation

Arkouda PDF Documentation

Arkouda docs at Github Pages

Nightly Arkouda Performance Charts

Arkouda nightly performance charts

Gitter channels

Arkouda Gitter channel

Chapel Gitter channel

Talks on Arkouda

Bill Reus' March 2021 talk at the NJIT Data Science Seminar

Bill Reus' CHIUW 2020 Keynote video and slides

Mike Merrill's CHIUW 2019 talk

Bill Reus' CLSAC 2019 talk

(PAW-ATM) talk and abstract


Exploratory data analysis (EDA) is a prerequisite for all data science, as illustrated by the ubiquity of Jupyter notebooks, the preferred interface for EDA among data scientists. The operations involved in exploring and transforming the data are often at least as computationally intensive as downstream applications (e.g. machine learning algorithms), and as datasets grow, so does the need for HPC-enabled EDA. However, the inherently interactive and open-ended nature of EDA does not mesh well with current HPC usage models. Meanwhile, several existing projects from outside the traditional HPC space attempt to combine interactivity and distributed computation using programming paradigms and tools from cloud computing, but none of these projects have come close to meeting our needs for high-performance EDA.

To fill this gap, we have developed a software package, called Arkouda, which allows a user to interactively issue massively parallel computations on distributed data using functions and syntax that mimic NumPy, the underlying computational library used in the vast majority of Python data science workflows. The computational heart of Arkouda is a Chapel interpreter that accepts a pre-defined set of commands from a client (currently implemented in Python) and uses Chapel's built-in machinery for multi-locale and multithreaded execution. Arkouda has benefited greatly from Chapel's distinctive features and has also helped guide the development of the language.

In early applications, users of Arkouda have tended to iterate rapidly between multi-node execution with Arkouda and single-node analysis in Python, relying on Arkouda to filter a large dataset down to a smaller collection suitable for analysis in Python, and then feeding the results back into Arkouda computations on the full dataset. This paradigm has already proved very fruitful for EDA. Our goal is to enable users to progress seamlessly from EDA to specialized algorithms by making Arkouda an integration point for HPC implementations of expensive kernels like FFTs, sparse linear algebra, and graph traversal. With Arkouda serving the role of a shell, a data scientist could explore, prepare, and call optimized HPC libraries on massive datasets, all within the same interactive session.
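Because Arkouda mirrors NumPy's array API, the filter-then-analyze pattern described above can often be written once and pointed at either library. The sketch below is illustrative only, with NumPy standing in for the `arkouda` module (the `filter_large` helper and `threshold` value are hypothetical, not part of either API):

```python
import numpy as np

def filter_large(values, threshold):
    # Boolean-mask filtering works the same way on NumPy arrays
    # and Arkouda pdarrays, since Arkouda mimics NumPy syntax.
    mask = values > threshold
    return values[mask]

# With Arkouda this would be `import arkouda as ak; ak.connect(...)`
# and `values` would be a distributed pdarray; here we use NumPy.
values = np.array([1, 50, 3, 99, 7])
small = filter_large(values, 10)
print(small)  # -> [50 99]
```

In the Arkouda case, the filtered result would typically be pulled back to a local NumPy array (e.g., via `to_ndarray()`) for single-node analysis in Python, as described above.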

Arkouda is not trying to replace Pandas but to allow for some Pandas-style operations at a much larger scale. In our experience, Pandas can handle dataframes of up to about 500 million rows before performance becomes a real issue, provided you run on a sufficiently capable compute server. Arkouda breaks the shared-memory paradigm and scales its operations to dataframes with over 200 billion rows, possibly up to a trillion. In practice, we have run Arkouda server operations on columns of one trillion elements across 512 compute nodes, yielding a >20TB dataframe in Arkouda.
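As a quick sanity check on these figures: a single column of one trillion 64-bit elements occupies 8 TB, so a dataframe of three or more such columns exceeds 20 TB.

```python
elements = 10**12          # one trillion elements per column
bytes_per_element = 8      # 64-bit integer or float
column_tb = elements * bytes_per_element / 10**12  # terabytes per column
print(column_tb)           # 8.0 TB per column
print(3 * column_tb)       # 24.0 TB for a three-column dataframe (>20 TB)
```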

Table of Contents

  1. Prerequisites
  2. Building Arkouda
  3. Testing Arkouda
  4. Installing Arkouda Python libs and deps
  5. Running arkouda_server
  6. Logging
  7. Type Checking in Arkouda
  8. Environment Variables
  9. Deploying Arkouda in Containers
  10. Contributing

Prerequisites toc

Requirements: toc

  • requires chapel 1.23.0
  • requires zeromq version >= 4.2.5, tested with 4.2.5 and 4.3.1
  • requires hdf5
  • requires python 3.7 or greater
  • requires numpy
  • requires typeguard for runtime type checking
  • requires pandas for testing and conversion utils
  • requires pytest, pytest-env, and h5py to execute the Python test harness
  • requires sphinx, sphinx-argparse, and sphinx-autoapi to generate docs

MacOS Environment toc

Installing Chapel toc

Option 1: Setup using brew

brew install zeromq

brew install hdf5

brew install chapel

Option 2: Build Chapel from source

# build chapel in the user home directory with these settings...
export CHPL_HOME=~/chapel/chapel-1.23.0
source $CHPL_HOME/util/setchplenv.bash
export CHPL_COMM=gasnet
export CHPL_TARGET_CPU=native

# Execute make to build Chapel
make

# Build chpldoc to enable generation of Arkouda docs
make chpldoc

# Add the Chapel and Chapel Doc executables (chpl and chpldoc, respectively) to 
# PATH either in ~/.bashrc (single user) or /etc/environment (all users):

export PATH=$CHPL_HOME/bin/linux64-x86_64/:$PATH

Mac - Python / Anaconda toc

While not required, it is highly recommended to install Anaconda to provide a Python 3 environment and manage Python dependencies. Otherwise, Python can be installed via brew.

# The recommended Python install is via Anaconda; after running the
# Anaconda installer, reload your shell configuration:
source ~/.bashrc

# Otherwise, Python 3 can be installed with brew
brew install python3

# these packages are nice but not a requirement (manual install required if Python installed with brew)
pip3 install pandas
pip3 install jupyter

Linux Environment toc

Installing Chapel on Linux toc

There are no prebuilt Chapel packages for Linux, so the first two steps of the Linux Arkouda install are to install the Chapel dependencies and then download and build Chapel.

# Update Linux kernel and install Chapel dependencies
sudo apt-get update
sudo apt-get install gcc g++ m4 perl python python-dev python-setuptools bash make mawk git pkg-config

# Download the latest Chapel release (from https://chapel-lang.org/download.html),
# extract the archive, and navigate to the source root directory
tar xvf chapel-1.23.0.tar.gz
cd chapel-1.23.0/


# Add chpl to PATH
source $CHPL_HOME/util/setchplenv.bash

# Set remaining env variables and execute make
export CHPL_COMM=gasnet
export CHPL_TARGET_CPU=native
make

# Build chpldoc to enable generation of Arkouda docs
make chpldoc

# Optionally add the Chapel executable (chpl) to the PATH for all users: /etc/environment
export PATH=$CHPL_HOME/bin/linux64-x86_64/:$PATH

Python environment setup - Anaconda toc

As is the case with the MacOS install, it is highly recommended to install Anaconda to provide a Python environment and manage Python dependencies:

 source ~/.bashrc

Building Arkouda toc

Download, clone, or fork the arkouda repo. Further instructions assume that the current directory is the top-level directory of the repo.

Build the source toc

If your environment requires non-system paths to find dependencies (e.g., if using the ZMQ and HDF5 bundled with Anaconda), append each path to a new file Makefile.paths like so:

# Makefile.paths

# Custom Anaconda environment for Arkouda
$(eval $(call add-path,/home/user/anaconda3/envs/arkouda))
#                      ^ Note: No space after comma.

The chpl compiler will be executed with -I, -L, and -rpath flags pointing at each path.
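For example, with the Makefile.paths entry above, the generated compiler flags look roughly like the following (the exact invocation may vary by platform and Makefile version):

```shell
chpl ... -I/home/user/anaconda3/envs/arkouda/include \
         -L/home/user/anaconda3/envs/arkouda/lib \
         --ldflags="-Wl,-rpath=/home/user/anaconda3/envs/arkouda/lib" ...
```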

# If zmq and hdf5 have not been installed previously, execute make install-deps
make install-deps

# Run make to build the arkouda_server executable
make

Building the Arkouda documentation toc

The Arkouda documentation is hosted on Read-the-Docs.


First ensure that all Python doc dependencies including sphinx and sphinx extensions have been installed as detailed above. Important: if Chapel was built locally, make chpldoc must be executed as detailed above to enable generation of the Chapel docs via the chpldoc executable.

Now that all doc generation dependencies for both Python and Chapel have been installed, there are three make targets for generating docs:

# make doc-python generates the Python docs only
make doc-python

# make doc-server generates the Chapel docs only
make doc-server

# make doc generates both Python and Chapel documentation
make doc

The Python docs are written out to the arkouda/docs directory while the Chapel docs are exported to the arkouda/docs/server directory.

arkouda/docs/ # Python frontend documentation
arkouda/docs/server # Chapel backend server documentation 

To view the Arkouda documentation locally, type the following url into the browser of choice: file:///path/to/arkouda/docs/index.html, substituting the appropriate path for the Arkouda directory configuration.

The make doc target detailed above prepares the Arkouda Python and Chapel docs for hosting both locally and on Read-the-Docs.

There are three easy steps to hosting Arkouda docs on Github Pages. First, the Arkouda docs generated via make doc are pushed to the Arkouda or Arkouda fork master branch. Next, navigate to the Github project home and click the "Settings" tab. Finally, scroll down to the Github Pages section and select the "master branch docs/ folder" source option. The Github Pages docs url will be displayed once the source option is selected. Click on the link and the Arkouda documentation homepage will be displayed.

Testing Arkouda toc


There are two unit test suites for Arkouda, one for Python and one for Chapel. As mentioned above, the Arkouda Python test harness leverages multiple libraries such as pytest and pytest-env that must be installed via pip3 install -e .[dev], whereas the Chapel test harness does not require any external libraries.

The default Arkouda test executes the Python test harness and is invoked as follows:

make test

The Chapel unit tests can be executed as follows:

make test-chapel

Both the Python and Chapel unit tests are executed as follows:

make test-all

For more details regarding Arkouda testing, please consult the Python test README and Chapel test README, respectively.

Installing the Arkouda Python Library and Dependencies toc

Now that the arkouda_server is built and tested, install the Python library.

The Arkouda Python library, along with its dependent libraries, is installed with pip. There are four types of Python dependencies for the Arkouda developer to install: requires, dev, test, and doc. The required libraries, which are the runtime dependencies of the Arkouda Python library, are installed as follows:

 pip3 install -e .

Arkouda and the Python libraries required for development, test, and doc generation activities are installed as follows:

pip3 install -e .[dev]

Running arkouda_server toc

The command-line invocation depends on whether you built a single-locale version (with CHPL_COMM=none) or a multi-locale version (with CHPL_COMM set to a communication layer such as gasnet); the number of locales is selected at launch time.

Single-locale startup:

./arkouda_server

Multi-locale startup (user selects the number of locales):

./arkouda_server -nl 2

By default, the server listens on port 5555. This value can be overridden with the command-line flag --ServerPort=1234

Memory tracking is turned on by default and can be turned off with the --memTrack=false flag:

./arkouda_server --memTrack=false

Trace logging messages are turned on by default and can be turned off with the --trace=false flag

Other command line options are available and can be viewed by using the --help flag

./arkouda_server --help

Sanity check arkouda_server toc

To sanity check the arkouda server, you can run

make check

This will start the server, run a few computations, and shut the server down. In addition, the check script can be executed against a running server by running the following Python command:

python3 tests/ localhost 5555

Token-Based Authentication in Arkouda toc

Arkouda features a token-based authentication mechanism analogous to Jupyter, where a randomized alphanumeric string is generated or loaded at arkouda_server startup. The command to start arkouda_server with token authentication is as follows:

./arkouda_server --authenticate

The generated token is saved to the tokens.txt file which is contained in the .arkouda directory located in the same working directory the arkouda_server is launched from. The arkouda_server will re-use the same token until the .arkouda/tokens.txt file is removed, which forces arkouda_server to generate a new token and corresponding tokens.txt file.
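For illustration, a token of the kind Arkouda generates (a randomized alphanumeric string) can be produced in Python as follows. This is a minimal sketch, not the server's actual implementation, and the 32-character length is an assumption:

```python
import secrets
import string

def generate_token(length=32):
    """Generate a random alphanumeric token, similar in spirit to the
    token arkouda_server writes to .arkouda/tokens.txt (illustrative only)."""
    alphabet = string.ascii_letters + string.digits
    return ''.join(secrets.choice(alphabet) for _ in range(length))

token = generate_token()
print(token)
```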

Connecting to Arkouda toc

The client connects to the arkouda_server either by supplying a host and port or by providing a connect_url connect string:

arkouda.connect(server='localhost', port=5555)

When arkouda_server is launched in authentication-enabled mode, clients connect by either specifying the access_token parameter or by adding the token to the end of the connect_url connect string:

arkouda.connect(server='localhost', port=5555, access_token='dcxCQntDQllquOsBNjBp99Pu7r3wDJn')

Note: once a client has successfully connected to an authentication-enabled arkouda_server, the token is cached in the user's $ARKOUDA_HOME/.arkouda/tokens.txt file. As long as the arkouda_server token remains the same, the user can connect without specifying the token via the access_token parameter or token url argument.

Logging toc

The Arkouda server features a Chapel logging framework that prints out the module name, routine name and line number for all logged messages. Available logging levels are ERROR, CRITICAL, WARN, INFO, and DEBUG.

The default logging level is INFO where all messages at the ERROR, CRITICAL, WARN, and INFO levels are printed. For debugging, the DEBUG level is enabled by passing in the --v flag upon arkouda_server startup.

Type Checking in Arkouda toc

Both static and runtime type checking are becoming increasingly popular in Python, especially for large Python code bases such as those found at Dropbox. Arkouda uses mypy for static type checking and typeguard for runtime type checking.


Enabling runtime as well as static type checking in Python starts with adding type hints to a method signature, as shown below:

def connect(server : str="localhost", port : int=5555, timeout : int=0,
            access_token : str=None, connect_url=None) -> None:

mypy static type checking can be invoked either directly via the mypy command or via make:

$ mypy arkouda
Success: no issues found in 16 source files
$ make mypy
python3 -m mypy arkouda
Success: no issues found in 16 source files

Runtime type checking is enabled at the Python method level by annotating the method of interest with the @typechecked decorator, an example of which is shown below:

def save(self, prefix_path : str, dataset : str='array', mode : str='truncate') -> str:

Type checking in Arkouda is implemented on an "opt-in" basis. Accordingly, Arkouda continues to support duck typing for parts of the Arkouda API where type checking is too confining to be useful. As detailed above, both runtime and static type checking require type hints. Consequently, to opt-out of type checking, simply leave type hints out of any method declarations where duck typing is desired.
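To illustrate what a decorator like @typechecked does, here is a minimal stdlib-only sketch of runtime type checking. This is a simplification for illustration only; typeguard itself handles far more cases (generics, Optional types, return values, and so on), and the `checked` decorator below is hypothetical, not part of Arkouda or typeguard:

```python
import functools
import inspect

def checked(func):
    """Minimal runtime type checker: validates annotated arguments
    against their declared (simple, non-generic) types."""
    sig = inspect.signature(func)

    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        for name, value in bound.arguments.items():
            ann = func.__annotations__.get(name)
            # Only check plain classes (str, int, ...), skip complex hints
            if isinstance(ann, type) and not isinstance(value, ann):
                raise TypeError(f"{name} must be {ann.__name__}, "
                                f"got {type(value).__name__}")
        return func(*args, **kwargs)
    return wrapper

@checked
def connect(server: str = "localhost", port: int = 5555) -> None:
    pass

connect("localhost", 5555)        # OK
try:
    connect("localhost", "5555")  # raises TypeError: port must be int
except TypeError as e:
    print(e)
```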

Environment Variables toc

The various Arkouda aspects (compilation, run-time, client, tests, etc.) can be configured using a number of environment variables (env vars). See the ENVIRONMENT documentation for more details.

Containerized Arkouda toc

Deployment of Arkouda in containers via orchestration frameworks such as Kubernetes is currently under investigation.

Dockerfiles for the Arkouda server and client are arkouda-server-docker and arkouda-client-docker, respectively, both of which are located in the project root directory.

The hokiegeek2 fork server and client docker images are available on dockerhub.

The Arkouda server can be deployed in a container runtime such as docker or containerd via the following CLI arguments:

docker run -it --rm -p 5555:5555 -e NUMLOCALES=3 -e MEMTRACK=true -e AUTHENTICATE=false -e VERBOSE=true hokiegeek2/arkouda-server:0.0.5 

Similarly, the Arkouda client docker can also be deployed in a container runtime:

docker run -it --rm hokiegeek2/arkouda-client:0.0.5

The Arkouda server can be deployed to Kubernetes via the Arkouda server Helm chart, which is located in the $PROJECT_HOME/arkouda-server-chart folder. The Arkouda helm chart is installed as follows:

helm install -n arkouda arkouda-server-deployment arkouda-server-chart/

In a Kubernetes environment, the Arkouda client is normally installed via pip within a Jupyter notebook deployed in Jupyterhub or Dask-Jupyter.

Contributing to Arkouda toc

If you'd like to contribute, please see the contributing guidelines.
