[!WARNING] causy is in a very early, experimental stage of development and currently supports only one algorithm. We do not recommend using it in production.

causy

Causal discovery made easy. causy lets you use and implement causal discovery algorithms with pipelines that are easy to use, extend, and maintain. It is built on PyTorch, which allows the algorithms to run seamlessly on CPUs as well as GPUs.

Background

Current causal discovery algorithms are often designed primarily for research purposes. They are frequently implemented in a monolithic way, which makes them hard to understand and extend. causy aims to solve this problem by providing a framework that lets you implement and use causal discovery algorithms by splitting them into smaller logical steps which can be stacked together to form a pipeline. This makes it easy to understand, extend, optimize, and experiment with the algorithms.

By shipping causy with sensibly configured default pipelines, we also aim to provide a tool that non-experts can use to get started with causal discovery.

Thanks to the PyTorch backend, causy is significantly faster than serial CPU-based implementations.

In the future, causy aims to provide interactive visualizations that help you understand the causal discovery process.

Installation

Currently, only Python 3.11 is supported. To install causy, run

pip install causy

Usage

causy can be used via the CLI or directly from code.

Usage via CLI

Run causy with one of the default algorithms

causy execute --help
causy execute your_data.json --algorithm PC --output-file output.json

The input data should be a JSON file containing a list of dictionaries. Each dictionary represents a data point: its keys are the variable names and its values are the corresponding variable values, which can be either numeric or categorical.

[
    {"a": 1, "b": 0.3},
    {"a": 0.5, "b": 0.2}
]
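
If your data is tabular (for example, a CSV file), the following is a minimal sketch of converting it into this format. It assumes pandas is available (pandas is not a causy requirement), and the file names are placeholders:

import json

import pandas as pd  # assumption: pandas is installed separately

# each CSV column becomes a variable, each row becomes one data point
df = pd.read_csv("your_data.csv")
records = df.to_dict(orient="records")

with open("your_data.json", "w") as f:
    json.dump(records, f)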

You can customize your causy pipeline by ejecting and modifying the pipeline file.

causy eject PC pc.json
# edit pc.json
causy execute your_data.json --pipeline pc.json

This is useful if you want to configure a custom algorithm or adjust the pipeline of a default algorithm.

Usage via Code

Use a default algorithm

from causy.algorithms import PC
from causy.utils import retrieve_edges

model = PC()

# build the graph from data in the list-of-dictionaries format described above
model.create_graph_from_data(
    [
        {"a": 1, "b": 0.3},
        {"a": 0.5, "b": 0.2}
    ]
)

# start from a fully connected graph, then run the PC pipeline steps
model.create_all_possible_edges()
model.execute_pipeline_steps()

edges = retrieve_edges(model.graph)

for edge in edges:
    print(
        f"{edge[0].name} -> {edge[1].name}: {model.graph.edges[edge[0]][edge[1]]}"
    )
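
The same steps also work with data loaded from a JSON file in the format described above. A small sketch (the file name your_data.json is just a placeholder):

import json

from causy.algorithms import PC
from causy.utils import retrieve_edges

# load data points in the list-of-dictionaries format described above
with open("your_data.json") as f:
    data = json.load(f)

model = PC()
model.create_graph_from_data(data)
model.create_all_possible_edges()
model.execute_pipeline_steps()

for edge in retrieve_edges(model.graph):
    print(f"{edge[0].name} -> {edge[1].name}")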

Use a custom algorithm

from causy.exit_conditions import ExitOnNoActions
from causy.graph import graph_model_factory, Loop
from causy.independence_tests import (
    CalculateCorrelations,
    CorrelationCoefficientTest,
    PartialCorrelationTest,
    ExtendedPartialCorrelationTestMatrix,
)
from causy.orientation_tests import (
    ColliderTest,
    NonColliderTest,
    FurtherOrientTripleTest,
    OrientQuadrupleTest,
    FurtherOrientQuadrupleTest,
)
from causy.utils import retrieve_edges

CustomPC = graph_model_factory(
    pipeline_steps=[
        # estimate correlations and remove edges between (conditionally)
        # independent variables
        CalculateCorrelations(),
        CorrelationCoefficientTest(threshold=0.1),
        PartialCorrelationTest(threshold=0.01),
        ExtendedPartialCorrelationTestMatrix(threshold=0.01),
        # orient colliders (v-structures)
        ColliderTest(),
        # repeat the remaining orientation rules until they no longer change the graph
        Loop(
            pipeline_steps=[
                NonColliderTest(),
                FurtherOrientTripleTest(),
                OrientQuadrupleTest(),
                FurtherOrientQuadrupleTest(),
            ],
            exit_condition=ExitOnNoActions(),
        ),
    ]
)

model = CustomPC()

model.create_graph_from_data(
    [
        {"a": 1, "b": 0.3},
        {"a": 0.5, "b": 0.2}
    ]
)
model.create_all_possible_edges()
model.execute_pipeline_steps()
edges = retrieve_edges(model.graph)

for edge in edges:
    print(
        f"{edge[0].name} -> {edge[1].name}: {model.graph.edges[edge[0]][edge[1]]}"
    )

Supported algorithms

Currently causy supports the following algorithms:

  • PC (Peter-Clark)
    • PC: the original PC algorithm without any modifications (causy.algorithms.PC)
    • ParallelPC: a parallelized version of the PC algorithm (causy.algorithms.ParallelPC); see the sketch below
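
A hedged sketch of using the parallelized variant, assuming ParallelPC exposes the same interface as PC in the examples above:

from causy.algorithms import ParallelPC
from causy.utils import retrieve_edges

# assumption: ParallelPC follows the same interface as PC shown above
model = ParallelPC()
model.create_graph_from_data(
    [
        {"a": 1, "b": 0.3},
        {"a": 0.5, "b": 0.2}
    ]
)
model.create_all_possible_edges()
model.execute_pipeline_steps()
edges = retrieve_edges(model.graph)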

Supported pipeline steps

Detailed information about the pipeline steps can be found in the API Documentation.

Dev usage

Setup

We recommend using Poetry to manage the dependencies. To install Poetry, follow the instructions at https://python-poetry.org/docs/#installation.

Install dependencies

poetry install

Execute tests

poetry run python -m unittest
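
When developing new pipeline steps, a small smoke test along these lines can be helpful. It is only a sketch; the test class is illustrative and not part of the causy test suite:

import unittest

from causy.algorithms import PC
from causy.utils import retrieve_edges


class PCSmokeTest(unittest.TestCase):
    # illustrative smoke test: the default PC pipeline should run end to end on toy data
    def test_pipeline_runs_on_toy_data(self):
        model = PC()
        model.create_graph_from_data(
            [{"a": 1, "b": 0.3}, {"a": 0.5, "b": 0.2}]
        )
        model.create_all_possible_edges()
        model.execute_pipeline_steps()
        edges = retrieve_edges(model.graph)
        # only check that retrieving edges succeeds and returns something iterable
        self.assertTrue(hasattr(edges, "__iter__"))


if __name__ == "__main__":
    unittest.main()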

Download files

Download the file for your platform.

Source Distribution

causy-0.0.15.tar.gz (20.2 kB)

Built Distribution

causy-0.0.15-py3-none-any.whl (24.6 kB)

File details

Details for the file causy-0.0.15.tar.gz.

File metadata

  • Download URL: causy-0.0.15.tar.gz
  • Upload date:
  • Size: 20.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for causy-0.0.15.tar.gz
  • SHA256: 2cd67c9345dcd44d1c5dc7c9ac4a7e8fae17d077e55baf5f9dd75852718a104f
  • MD5: c0507a994c3bbda8d6c72cd3c4f1cb2b
  • BLAKE2b-256: 8a98f89daf35bb02ead3c6953c6d96beb1a5ee5b6a6e37853309e38a20c69abc

File details

Details for the file causy-0.0.15-py3-none-any.whl.

File metadata

  • Download URL: causy-0.0.15-py3-none-any.whl
  • Upload date:
  • Size: 24.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes

Hashes for causy-0.0.15-py3-none-any.whl
  • SHA256: e951183fbab328c4d7e279880d6e5450a4e6bdfce9558c04ade5204edd06a271
  • MD5: dada7ab6e70f10669f97a443b564bd62
  • BLAKE2b-256: d5e4a48a02ebb0c3bc865b6beeb0bd4538937950dbe9ae2d148986bdab684ccf
