# Product of copulas

A few days ago I wrote a post about copulas and operations on them that have a group structure. Here’s another example of group structure for copulas. As in the previous post I’m just looking at two-dimensional copulas to keep things simple.

Given two copulas C1 and C2, you can define a sort of product between them by

(C1 * C2)(u, v) = ∫₀¹ D2 C1(u, t) D1 C2(t, v) dt.

Here Di is the partial derivative with respect to the ith variable.

The product of two copulas is another copula. This product is associative but not commutative. There is an identity element, so copulas with this product form a monoid, i.e. a semigroup with identity.
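As a numerical sanity check (not from the post), here is a minimal sketch of the product using SciPy. The function name `copula_product` and the finite-difference approximation of the partial derivatives are my own choices.

```python
from scipy.integrate import quad

def copula_product(C1, C2, u, v, h=1e-5):
    """Approximate (C1 * C2)(u, v) = integral over [0,1] of
    D2 C1(u, t) * D1 C2(t, v) dt, with the partial derivatives
    replaced by central finite differences of width 2h."""
    def integrand(t):
        d2C1 = (C1(u, t + h) - C1(u, t - h)) / (2 * h)  # ~ D2 C1(u, t)
        d1C2 = (C2(t + h, v) - C2(t - h, v)) / (2 * h)  # ~ D1 C2(t, v)
        return d2C1 * d1C2
    # integrate h away from the endpoints so t +/- h stays inside [0, 1]
    return quad(integrand, h, 1 - h, limit=200)[0]

# Sanity check: the independence copula Pi(u, v) = uv satisfies Pi * Pi = Pi.
Pi = lambda u, v: u * v
print(copula_product(Pi, Pi, 0.3, 0.7))  # ≈ 0.21 = Pi(0.3, 0.7)
```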

The identity element is the copula

M(u, v) = min(u, v),

that is,

M * C = C * M = C

for any copula C.

The copula M is important because it is the upper of the two Fréchet-Hoeffding bounds: for any copula C,

C(u, v) ≤ min(u, v) = M(u, v).

There is also a sort of null element for our semigroup, and that is the independence copula

Π(u, v) = uv.

It’s called the independence copula because it’s the copula for two independent random variables: their joint CDF is the product of their individual CDFs. It acts like a null element because

Π * C = C * Π = Π

for any copula C.

This tells us we have a monoid and not a group: the independence copula cannot have an inverse, since Π * C = Π ≠ M for every copula C.
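The identity and null-element claims can be spot-checked numerically. Below is a self-contained sketch; the helper name `star`, the finite-difference scheme, and the choice of an FGM test copula are my own assumptions, not from the post.

```python
from scipy.integrate import quad

def star(C1, C2, u, v, h=1e-5):
    # (C1 * C2)(u, v) = integral of D2 C1(u, t) * D1 C2(t, v) over [0, 1],
    # with the partial derivatives approximated by central differences
    f = lambda t: ((C1(u, t + h) - C1(u, t - h)) *
                   (C2(t + h, v) - C2(t - h, v)) / (4 * h * h))
    return quad(f, h, 1 - h, limit=200, epsabs=1e-6)[0]

M  = lambda u, v: min(u, v)                              # identity element
Pi = lambda u, v: u * v                                  # independence copula
C  = lambda u, v: u * v * (1 + 0.5 * (1 - u) * (1 - v))  # an FGM copula

u, v = 0.3, 0.7
print(star(M, C, u, v), C(u, v))    # M * C ≈ C
print(star(Pi, C, u, v), Pi(u, v))  # Pi * C ≈ Pi
```

The FGM (Farlie-Gumbel-Morgenstern) family is just a convenient non-trivial test case; any copula would do.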

Reference: Roger B. Nelsen. An Introduction to Copulas.

## 8 thoughts on “Product of copulas”

1. Nathan Hannon

This is interesting because it looks a lot like a ring, but isn’t one. The multiplication operation distributes over ordinary addition, but the sum of two copulas isn’t a copula, and the element that behaves like 0 with respect to multiplication looks nothing like the additive identity.

2. Is there also an operator for which the lower Fréchet-Hoeffding bound is the identity?

3. Good question. Nelsen does give some equations involving W, the Fréchet-Hoeffding lower bound copula.

For one, W*W = M, so W is a sort of square root of M. Also, W*C*W gives the survival copula associated with C. So the ^ operator that takes a copula to its survival copula is a sort of conjugation.

4. Felipe

Is there a probabilistic interpretation of the resulting random pair?

Something similar to the fact that the sum of two independent random variables has as its distribution the convolution of the distributions of the summands.

5. @Felipe: Yes. For a Markov process and times s < u < t, the copula for (X_s, X_t) is the * product of the copulas for (X_s, X_u) and (X_u, X_t). See Nelsen Theorem 6.4.3. Nelsen cites W. F. Darsow et al., Copulas and Markov processes, Illinois J. Math. 36:600–642.

6. Sounds like W is a kind of -1? At least it seems that W*Π = Π. Then we could define -C as W*C, but that would only make sense if W*C = C*W, and I’m not sure about that.

7. Felipe

@John

Thanks for the reference!

8. It seems that

(W * C)(u, v) = v – C(1 – u, v)

and

(C * W)(u, v) = u – C(u, 1 – v).
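These closed forms, along with the earlier claims that W*W = M and W*Π = Π, can be checked numerically. A self-contained sketch (the helper name `star`, the finite-difference scheme, and the FGM test copula are my own assumptions):

```python
from scipy.integrate import quad

def star(C1, C2, u, v, h=1e-5):
    # (C1 * C2)(u, v) = integral of D2 C1(u, t) * D1 C2(t, v) over [0, 1],
    # via central finite differences for the partial derivatives
    f = lambda t: ((C1(u, t + h) - C1(u, t - h)) *
                   (C2(t + h, v) - C2(t - h, v)) / (4 * h * h))
    return quad(f, h, 1 - h, limit=200, epsabs=1e-6)[0]

W = lambda u, v: max(u + v - 1, 0)                       # lower F-H bound
M = lambda u, v: min(u, v)                               # upper F-H bound
C = lambda u, v: u * v * (1 + 0.5 * (1 - u) * (1 - v))   # an FGM copula

u, v = 0.3, 0.7
print(star(W, W, u, v), M(u, v))          # W * W ≈ M
print(star(W, C, u, v), v - C(1 - u, v))  # ≈ v - C(1 - u, v)
print(star(C, W, u, v), u - C(u, 1 - v))  # ≈ u - C(u, 1 - v)
```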