
[!WARNING] causy is in a very early, experimental stage of development and currently supports only one algorithm. We do not recommend using it in production.

causy

Causal discovery made easy. Causy lets you use and implement causal discovery algorithms with pipelines that are easy to use, extend, and maintain. It is built on PyTorch, which allows the algorithms to run seamlessly on CPUs as well as GPUs.

Background

Current causal discovery algorithms are often designed primarily for research and implemented in a monolithic way, which makes them hard to understand and extend. Causy aims to solve this problem by providing a framework that lets you implement and use causal discovery algorithms by splitting them up into smaller logical steps, which can be stacked together to form a pipeline (sketched below). This makes it easy to understand, extend, optimize, and experiment with the algorithms.
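
To make the idea concrete, here is a generic illustration (not causy's actual classes or interfaces): a pipeline is an ordered list of small steps, each of which takes the working graph and transforms it, for example by dropping an edge that fails an independence test or by orienting a collider.

# Generic sketch of the pipeline idea -- not causy's real API.
def run_pipeline(graph, steps):
    for step in steps:
        graph = step(graph)  # each step removes or orients edges in the graph
    return graph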

By shipping causy with sensibly configured default pipelines, we also aim to provide a tool that non-experts can use to get started with causal discovery.

Thanks to the PyTorch backend, causy is considerably faster than serial CPU-based implementations.

In the future, causy aims to provide interactive visualizations which allow you to understand the causal discovery process.

Installation

Currently, only Python 3.11 is supported. To install causy, run:

pip install causy

Usage

Causy can be used via the CLI or directly from code.

Usage via CLI

Run causy with one of the default algorithms

causy execute --help
causy execute your_data.json --algorithm PC --output-file output.json

The input data should be a JSON file containing a list of dictionaries. Each dictionary represents one data point: its keys are the variable names and its values are the corresponding variable values, which can be either numeric or categorical.

[
    {"a": 1, "b": 0.3},
    {"a": 0.5, "b": 0.2}
]
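
A dataset that mixes numeric and categorical values could, for example, look like this (the variable names are made up for illustration):

[
    {"age": 31, "smoker": "yes"},
    {"age": 45, "smoker": "no"}
]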

You can customize your causy pipeline by ejecting and modifying the pipeline file.

causy eject PC pc.json
# edit pc.json
causy execute your_data.json --pipeline pc.json

This might be useful if you want to configure a custom algorithm or if you want to customize the pipeline of a default algorithm.

Usage via Code

Use a default algorithm

from causy.algorithms import PC
from causy.utils import retrieve_edges

model = PC()

# Load the data points into the graph model (one node per variable).
model.create_graph_from_data(
    [
        {"a": 1, "b": 0.3},
        {"a": 0.5, "b": 0.2}
    ]
)

# Start from a fully connected graph, then run the PC pipeline steps,
# which remove and orient edges.
model.create_all_possible_edges()
model.execute_pipeline_steps()

# Print the remaining edges and their metadata.
edges = retrieve_edges(model.graph)
for edge in edges:
    print(
        f"{edge[0].name} -> {edge[1].name}: {model.graph.edges[edge[0]][edge[1]]}"
    )
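
In practice the data usually comes from a file rather than an inline list. The following is a minimal sketch that assumes a CSV file named data.csv with one column per variable; pandas is not a causy requirement and is only used here to build the list of dictionaries.

import pandas as pd

from causy.algorithms import PC
from causy.utils import retrieve_edges

# Hypothetical input file; each row becomes one data point dictionary.
samples = pd.read_csv("data.csv").to_dict(orient="records")

model = PC()
model.create_graph_from_data(samples)
model.create_all_possible_edges()
model.execute_pipeline_steps()

for edge in retrieve_edges(model.graph):
    print(f"{edge[0].name} -> {edge[1].name}")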

Use a custom algorithm

from causy.exit_conditions import ExitOnNoActions
from causy.graph import graph_model_factory, Loop
from causy.independence_tests import (
    CalculateCorrelations,
    CorrelationCoefficientTest,
    PartialCorrelationTest,
    ExtendedPartialCorrelationTestMatrix,
)
from causy.orientation_tests import (
    ColliderTest,
    NonColliderTest,
    FurtherOrientTripleTest,
    OrientQuadrupleTest,
    FurtherOrientQuadrupleTest,
)
from causy.utils import retrieve_edges

CustomPC = graph_model_factory(
    pipeline_steps=[
        # Skeleton phase: estimate correlations and remove edges that fail
        # the (partial) correlation-based independence tests.
        CalculateCorrelations(),
        CorrelationCoefficientTest(threshold=0.1),
        PartialCorrelationTest(threshold=0.01),
        ExtendedPartialCorrelationTestMatrix(threshold=0.01),
        # Orientation phase: orient colliders, then repeat the remaining
        # orientation rules until a pass makes no further changes.
        ColliderTest(),
        Loop(
            pipeline_steps=[
                NonColliderTest(),
                FurtherOrientTripleTest(),
                OrientQuadrupleTest(),
                FurtherOrientQuadrupleTest(),
            ],
            exit_condition=ExitOnNoActions(),
        ),
    ]
)

model = CustomPC()

model.create_graph_from_data(
    [
        {"a": 1, "b": 0.3},
        {"a": 0.5, "b": 0.2}
    ]
)
model.create_all_possible_edges()
model.execute_pipeline_steps()
edges = retrieve_edges(model.graph)

for edge in edges:
    print(
        f"{edge[0].name} -> {edge[1].name}: {model.graph.edges[edge[0]][edge[1]]}"
    )
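
As the step names suggest, the Loop repeats the orientation rules until ExitOnNoActions detects a pass in which nothing changed, and the threshold arguments control how strict the independence tests are; adjusting these steps and thresholds is the main way to tune a custom pipeline.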

Supported algorithms

Currently causy supports the following algorithms:

  • PC (Peter-Clark)
    • PC: the original PC algorithm without any modifications (causy.algorithms.PC)
    • ParallelPC: a parallelized version of the PC algorithm (causy.algorithms.ParallelPC); see the example below
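
Assuming ParallelPC exposes the same graph-model interface as PC (the examples above do not show it explicitly, so treat this as a sketch), swapping it in looks like this:

from causy.algorithms import ParallelPC

model = ParallelPC()  # assumed to be a drop-in replacement for PC
model.create_graph_from_data(
    [
        {"a": 1, "b": 0.3},
        {"a": 0.5, "b": 0.2}
    ]
)
model.create_all_possible_edges()
model.execute_pipeline_steps()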

Supported pipeline steps

Detailed information about the pipeline steps can be found in the API Documentation.

Dev usage

Setup

We recommend using poetry to manage the dependencies. To install poetry, follow the instructions at https://python-poetry.org/docs/#installation.

Install dependencies

poetry install

Execute tests

poetry run python -m unittest

Download files

Download the file for your platform.

Source Distribution

causy-0.0.19.tar.gz (20.4 kB)

Uploaded Source

Built Distribution

causy-0.0.19-py3-none-any.whl (24.8 kB)

Uploaded Python 3

File details

Details for the file causy-0.0.19.tar.gz.

File metadata

  • Download URL: causy-0.0.19.tar.gz
  • Upload date:
  • Size: 20.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for causy-0.0.19.tar.gz

  • SHA256: 2a1cd7bc1fb36606384d4e963d1288efe979bf4461487ab6d1d0b6dc807034d5
  • MD5: 515a8f7e998c44b390f083e3e6d53d20
  • BLAKE2b-256: c5ac3458bd8cbd96939e0dbbc352c61cdf4c8465e101ec9d9502fb0e073b94dc


File details

Details for the file causy-0.0.19-py3-none-any.whl.

File metadata

  • Download URL: causy-0.0.19-py3-none-any.whl
  • Upload date:
  • Size: 24.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for causy-0.0.19-py3-none-any.whl

  • SHA256: c27aa39280700b788343250b6174817f03f73e1cbd00512d1bf00036c4c10a62
  • MD5: 69a8ecce7b48f3e2e450426286a31594
  • BLAKE2b-256: 7ead569d0f4255dc35e53cc4faa65776284af7d8afe0cb28ed85189036fbfa55

