
Copula for multivariate dependence modeling

Project description


Overview

Pycop is a Python package for modeling multivariate dependence with copulas. It provides estimation, random sample generation, and graphical representation for commonly used copula functions, and it supports mixture models defined as convex combinations of copulas. Methods based on the empirical copula, such as the non-parametric Tail Dependence Coefficient (TDC), are also included.
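
A mixture of k copulas C_1, ..., C_k is understood here as the convex combination

C_mix(u_1, ..., u_d) = w_1 C_1(u_1, ..., u_d) + ... + w_k C_k(u_1, ..., u_d),

with weights w_i >= 0 that sum to 1.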

Some of the features covered:

  • Elliptical copulas (Gaussian & Student) and commonly used Archimedean copulas
  • Mixture models combining up to 3 copula functions
  • Multivariate random sample generation
  • Empirical copula methods
  • Parametric and non-parametric Tail Dependence Coefficient (TDC)

Available copula functions

The copulas listed below are implemented; depending on the family, pycop provides bivariate graph & estimation, multivariate simulation, and mixture support:

  • Gaussian
  • Student
  • Clayton
  • Rotated Clayton
  • Gumbel
  • Rotated Gumbel
  • Frank
  • Joe
  • Rotated Joe
  • Galambos
  • Rotated Galambos
  • BB1
  • BB2
  • FGM
  • Plackett
  • AMH

Usage

Install pycop using pip:

pip install pycop

Examples

The following example notebooks are available on Google Colab:

  • Estimations on MSCI returns
  • Graphical Representations
  • Simulations


Graphical Representation

We first create a copula object by specifying the copula family:

from pycop import archimedean
cop = archimedean(family="clayton")

Plot the cdf and pdf of the copula.

3d plot

cop = archimedean(family="gumbel")

cop.plot_cdf([2], plot_type="3d", Nsplit=100 )
cop.plot_pdf([2], plot_type="3d", Nsplit=100, cmap="cividis" )

Contour plot

Plot the contour of the cdf and pdf:

cop = archimedean(family="plackett")

cop.plot_cdf([2], plot_type="contour", Nsplit=100 )
cop.plot_pdf([2], plot_type="contour", Nsplit=100, )

It is also possible to add specific marginal distributions:

cop = archimedean(family="clayton")

from scipy.stats import norm


marginals = [
    {"distribution": norm, "loc": 0, "scale": 0.8},
    {"distribution": norm, "loc": 0, "scale": 0.6},
]

cop.plot_mpdf([2], marginals, plot_type="3d",Nsplit=100,
            rstride=1, cstride=1,
            antialiased=True,
            cmap="cividis",
            edgecolor='black',
            linewidth=0.1,
            zorder=1,
            alpha=1)

lvls = [0.02, 0.05, 0.1, 0.2, 0.3]

cop.plot_mpdf([2], marginals, plot_type="contour", Nsplit=100,  levels=lvls)

Mixture plot

Mixture of two copulas:

from pycop import mixture

cop = mixture(["clayton", "gumbel"])
# parameters: [weight of the first copula, theta_clayton, theta_gumbel]
cop.plot_pdf([0.2, 2, 2], plot_type="contour", Nsplit=40, levels=[0.1, 0.4, 0.8, 1.3, 1.6])
# plot with the marginals defined above
cop.plot_mpdf([0.2, 2, 2], marginals, plot_type="contour", Nsplit=50)

Mixture of three copulas:

cop = mixture(["clayton", "gaussian", "gumbel"])
# parameters: [three weights, then theta_clayton, rho_gaussian, theta_gumbel]
cop.plot_pdf([1/3, 1/3, 1/3, 2, 0.5, 4], plot_type="contour", Nsplit=40, levels=[0.1, 0.4, 0.8, 1.3, 1.6])
cop.plot_mpdf([1/3, 1/3, 1/3, 2, 0.5, 2], marginals, plot_type="contour", Nsplit=50)

Simulation

Gaussian

import numpy as np
from scipy.stats import norm
from pycop import simulation

n = 2     # dimension
m = 1000  # sample size

# correlation matrix of the Gaussian copula
corrMatrix = np.array([[1, 0.8], [0.8, 1]])
u1, u2 = simulation.simu_gaussian(n, m, corrMatrix)

To add Gaussian marginals, use the distribution's ppf from scipy.stats to transform the uniform margins into the desired distribution:

u1 = norm.ppf(u1)
u2 = norm.ppf(u2)
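
As a quick visual check (a minimal sketch; matplotlib is assumed to be available but is not a pycop dependency), the simulated pairs can be plotted:

import matplotlib.pyplot as plt

plt.scatter(u1, u2, s=5, alpha=0.5)
plt.xlabel("x1 (standard normal margin)")
plt.ylabel("x2 (standard normal margin)")
plt.title("Sample from a Gaussian copula (rho = 0.8)")
plt.show()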

Student

u1, u2 = simulation.simu_tstudent(n, m, corrMatrix, nu=1)  # nu = degrees of freedom

Archimedean

Any of the available Archimedean copula families can be simulated by passing its name:

u1, u2 = simulation.simu_archimedean("gumbel", n, m, theta=2)

Rotated

To simulate from the rotated (survival) copula, take the complement of the uniforms, which swaps lower and upper tail dependence:

u1, u2 = 1 - u1, 1 - u2

High dimension

n = 3       # Dimension
m = 1000    # Sample size

# Gaussian copula in dimension 3
corrMatrix = np.array([[1, 0.9, 0], [0.9, 1, 0], [0, 0, 1]])
u = simulation.simu_gaussian(n, m, corrMatrix)
u = norm.ppf(u)  # transform the uniform margins to standard normal

# Archimedean (Clayton) copula in dimension 3
u = simulation.simu_archimedean("clayton", n, m, theta=2)
u = norm.ppf(u)

Mixture simulation

Simulation from a mixture of 2 copulas

n = 3
m = 2000

# the weights must sum to 1
combination = [
    {"type": "clayton", "weight": 1/2, "theta": 2},
    {"type": "gumbel", "weight": 1/2, "theta": 3}
]

u = simulation.simu_mixture(n, m, combination)
u = norm.ppf(u)

Simulation from a mixture of 3 copulas

corrMatrix = np.array([[1, 0.8, 0], [0.8, 1, 0], [0, 0, 1]])  # correlation matrix for the Student component


combination = [
    {"type": "clayton", "weight": 1/3, "theta": 2},
    {"type": "student", "weight": 1/3, "corrMatrix": corrMatrix, "nu":2},
    {"type": "gumbel", "weight": 1/3, "theta":3}
]

u = simulation.simu_mixture(n, m, combination)
u = norm.ppf(u)
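
To check the dependence produced by the mixture, pairwise rank correlations can be computed with scipy (a sketch that assumes, as in the bivariate examples above, one row of the simulated output per dimension):

from scipy.stats import kendalltau

tau_12, _ = kendalltau(u[0], u[1])
tau_13, _ = kendalltau(u[0], u[2])
print(f"Kendall's tau (dim 1 vs dim 2): {tau_12:.3f}")
print(f"Kendall's tau (dim 1 vs dim 3): {tau_13:.3f}")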

Estimation

The available estimation method is Canonical Maximum Likelihood Estimation (CMLE), in which the margins are replaced by their empirical distributions before the copula parameter is estimated by maximum likelihood.

Import a sample with pandas and compute the log returns:

import pandas as pd
import numpy as np

df = pd.read_csv("data/msci.csv")
df.index = pd.to_datetime(df["Date"], format="%m/%d/%Y")
df = df.drop(["Date"], axis=1)

for col in df.columns.values:
    df[col] = np.log(df[col]) - np.log(df[col].shift(1))

df = df.dropna()

from pycop import estimation, archimedean

cop = archimedean("clayton")
data = df[["US","UK"]].T.values
param, cmle = estimation.fit_cmle(cop, data)

Output:

clayton estim: 0.8025977727691012
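
For reference, the fitted Clayton parameter can be translated into familiar dependence measures with the standard closed-form expressions (a sketch, not part of pycop's output):

theta = param  # fitted Clayton parameter returned by fit_cmle above
implied_tau = theta / (theta + 2)  # Kendall's tau for the Clayton copula
implied_ltdc = 2 ** (-1 / theta)   # lower tail dependence coefficient
print(f"Implied Kendall's tau: {implied_tau:.4f}")
print(f"Implied lower TDC: {implied_ltdc:.4f}")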

Tail Dependence Coefficient

Theoretical TDC

from pycop import archimedean

cop = archimedean("clayton")

cop.LTDC(theta=0.5)
cop.UTDC(theta=0.5)
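
For the Clayton copula these quantities have closed forms, λ_L(θ) = 2^(-1/θ) and λ_U(θ) = 0, so with θ = 0.5 the calls above should return approximately 2^(-2) = 0.25 and 0.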

For a mixture copula, the copula with lower tail dependence comes first, and the one with upper tail dependence is last.

from pycop import mixture

cop = mixture(["clayton", "gaussian", "gumbel"])

LTDC = cop.LTDC(weight = 0.2, theta = 0.5) 
UTDC = cop.UTDC(weight = 0.2, theta = 1.5) 

Non-parametric TDC

Create an empirical copula object

from pycop import empirical

cop = empirical(df[["US","UK"]].T.values)

Compute the non-parametric Upper TDC (UTDC) or the Lower TDC (LTDC) for a given threshold:

cop.LTDC(0.01) # i/n = 1%
cop.UTDC(0.99) # i/n = 99%

Optimal Empirical TDC

Returns the optimal non-parametric TDC based on the heuristic plateau-finding algorithm from Frahm et al. (2005), "Estimating the tail-dependence coefficient: properties and pitfalls".

cop.optimal_tdc("upper")
cop.optimal_tdc("lower")
