Extreme Value Properties of Multivariate t Copulas

Department of Mathematics

Technical Report 2008-2

Extreme Value Properties of Multivariate t Copulas Aristidis K. Nikoloulopoulos, Harry Joe, Haijun Li April 2008

Postal address: Department of Mathematics, Washington State University, Pullman, WA 99164-3113 Voice: 509-335-3926 • Fax: 509-335-1188 • Email: [email protected] • URL: http://www.sci.wsu.edu/math

Extreme Value Properties of Multivariate t Copulas

Aristidis K. Nikoloulopoulos, Harry Joe, Haijun Li





Abstract

The extremal dependence behavior of t copulas is examined and their extreme value limiting copulas, called the t-EV copulas, are derived explicitly using tail dependence functions. As two special cases, the Hüsler-Reiss and the Marshall-Olkin distributions emerge as limits of the t-EV copula as the degrees of freedom go to infinity and to zero, respectively. The t copula and its extremal variants attain a wide range in the set of bivariate tail dependence parameters.

Mathematics Subject Classification: 62H20, 91B30

Key Words and Phrases: Tail dependence function, extreme value, t copula.

1 Introduction

It has been well documented (see, for example, Demarta and McNeil, 2005; McNeil et al., 2005) that t copulas are generally superior to Gaussian copulas in the context of modeling multivariate financial return data, due in great part to the ability of t copulas to capture tail dependence among extreme values. This paper is motivated by studying the range of the set of bivariate tail dependence parameters for t copulas, and by showing that sharp bounds for a triplet of bivariate tail dependence parameters are almost always reached.

Let F denote the distribution function of a random vector X = (X_1, …, X_d) with continuous margins F_1, …, F_d. The copula C of F can be uniquely expressed as

  C(u_1, …, u_d) = F(F_1^{−1}(u_1), …, F_d^{−1}(u_d)),  (u_1, …, u_d) ∈ [0, 1]^d,   (1.1)

where F_i^{−1}, 1 ≤ i ≤ d, are the quantile functions of the margins. That is, the copula C can be thought of as the distribution function of the marginally transformed random vector (U_1, …, U_d) = (F_1(X_1), …, F_d(X_d)) with standard uniform marginal distributions (see, e.g., Joe, 1997). For example, a Gaussian copula is obtained via (1.1) from a multivariate standard normal distribution with correlation matrix R, and a t copula is obtained via (1.1) from a multivariate t distribution with ν degrees of freedom (d.f.) and dispersion matrix Σ. The Gaussian and t copulas inherit the dependence structure of the multivariate normal and t distributions, respectively.

∗ Supported by NSERC Discovery Grant.
† {akn,harry}@stat.ubc.ca, Department of Statistics, University of British Columbia, Vancouver, BC, Canada V6T 1Z2.
‡ [email protected], Department of Mathematics, Washington State University, Pullman, WA 99164, USA.


One important dependence property is tail dependence, which describes the dependence in the lower or upper tails of a distribution. In the bivariate case, the lower and upper tail dependence parameters are defined as follows (see Chapter 2 of Joe, 1997):

  λ^L_{ij} = lim_{u↓0} Pr{X_i ≤ F_i^{−1}(u), X_j ≤ F_j^{−1}(u)} / u,
  λ^U_{ij} = lim_{u↑1} Pr{X_i > F_i^{−1}(u), X_j > F_j^{−1}(u)} / (1−u),

for any 1 ≤ i < j ≤ d. For (reflection) symmetric distributions such as the t distributions, λ_{ij} := λ^L_{ij} = λ^U_{ij}. If, furthermore, the univariate margins are identical, then we have

  λ_{ij} = lim_{u↓0} C_{ij}(u, u)/u = lim_{t↓−∞} F_{ij}(t, t)/F_i(t),

where C_{ij} and F_{ij} denote the (i, j) bivariate margins of C and F, respectively. It is well known that the Gaussian copula has no tail dependence (i.e., λ_{ij} = 0 for any i ≠ j), whereas the t copula enjoys the full spectrum of tail dependence (Embrechts et al., 2002):

  λ_{ij} = 2 T_{ν+1}( −√(ν+1) · √(1−ρ_{ij}) / √(1+ρ_{ij}) ),   (1.2)

where T_m denotes the univariate t distribution function with m d.f., and ρ_{ij} is the correlation coefficient of the (i, j) margin. That is, λ_{ij} is increasing in ρ_{ij}, going from 0 to 1 as ρ_{ij} goes from −1 to 1.
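Formula (1.2) is easy to check numerically. The following is a minimal sketch (not from the paper, whose numerics could equally be done with R's mvtnorm); to stay dependency-free it only uses the d.f. for which T_{ν+1} has a closed form, ν = 0 (Cauchy) and ν = 1.

```python
# Numerical sketch of the t-copula tail dependence parameter (1.2).
from math import atan, acos, pi, sqrt

def t_cdf(x, df):
    # closed-form univariate t cdf, implemented for df = 1 or 2 only (sketch assumption)
    if df == 1:
        return atan(x) / pi + 0.5
    if df == 2:
        return 0.5 * (1.0 + x / sqrt(2.0 + x * x))
    raise ValueError("only df in {1, 2} implemented here")

def t_tail_dependence(nu, rho):
    """lambda = 2 T_{nu+1}( -sqrt((nu+1)(1-rho)/(1+rho)) ), formula (1.2)."""
    return 2.0 * t_cdf(-sqrt((nu + 1.0) * (1.0 - rho) / (1.0 + rho)), nu + 1)

# lambda is increasing in rho and reaches 1 as rho -> 1
lams = [t_tail_dependence(1, r) for r in (-0.9, -0.5, 0.0, 0.5, 0.9)]
assert all(a < b for a, b in zip(lams, lams[1:]))
assert abs(t_tail_dependence(1, 1.0) - 1.0) < 1e-12

# for nu = 0 (the Cauchy case used in Section 3), (1.2) reduces to 1 - arccos(rho)/pi
for r in (-0.5, 0.0, 0.7):
    assert abs(t_tail_dependence(0, r) - (1.0 - acos(r) / pi)) < 1e-12
```

The closed-form ν = 0 identity is exactly the reduction used in Section 3 to analyze the ν → 0 limit.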

Let {λ_{ij} : 1 ≤ i < j ≤ d} be the set of bivariate (lower or upper) tail dependence parameters of a d-dimensional copula C. The study of the range of triplets (λ_{ij}, λ_{jk}, λ_{ik}), for distinct i, j, k, is equivalent to the study of the range of λ_{ik} given λ_{ij}, λ_{jk}. A copula family with a wider range has more versatility in modeling tail behavior. For t copulas, the inequalities for a triplet of tail dependence parameters are related to the inequalities for a triplet of correlation coefficients (ρ_{ij}, ρ_{jk}, ρ_{ik}). We will refer to the parameters of the dispersion matrix of the multivariate t distribution as correlations, even though second moments exist only for ν > 2. Based on partial correlations (see, e.g., Kurowicka and Cooke, 2006), for elliptical copulas, which include the t copulas, the correlation coefficients ρ_{ik}, 1 ≤ i < k ≤ d, satisfy −1 ≤ ρ_{ik;j} ≤ 1, or

  ρ_{ij} ρ_{jk} − √((1−ρ_{ij}²)(1−ρ_{jk}²)) ≤ ρ_{ik} ≤ ρ_{ij} ρ_{jk} + √((1−ρ_{ij}²)(1−ρ_{jk}²)),  i < j < k.   (1.3)

Inequalities for the bivariate tail dependence parameters have also been established (Theorem 3.14, Joe, 1997):

  max{0, λ_{ij} + λ_{jk} − 1} ≤ λ_{ik} ≤ 1 − |λ_{ij} − λ_{jk}|,  i < j < k.   (1.4)
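The bounds in (1.3) mark exactly where the 3×3 correlation matrix becomes singular, which can be checked directly with its determinant. A small sketch (names are illustrative, not from the paper):

```python
# At either bound of (1.3) the 3x3 correlation matrix [[1,a,b],[a,1,c],[b,c,1]]
# is singular (determinant zero); strictly inside the bounds it is positive definite.
from math import sqrt

def det3(a, b, c):
    # determinant of the 3x3 correlation matrix with off-diagonal entries a, b, c
    # (symmetric in its three arguments)
    return 1.0 + 2.0 * a * b * c - a * a - b * b - c * c

def rho_ik_bounds(rho_ij, rho_jk):
    s = sqrt((1.0 - rho_ij ** 2) * (1.0 - rho_jk ** 2))
    return rho_ij * rho_jk - s, rho_ij * rho_jk + s

lo, hi = rho_ik_bounds(0.3, -0.6)
assert abs(det3(0.3, -0.6, lo)) < 1e-12   # singular at the lower bound
assert abs(det3(0.3, -0.6, hi)) < 1e-12   # singular at the upper bound
assert det3(0.3, -0.6, 0.5 * (lo + hi)) > 0.0  # positive definite inside
```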

Unlike (1.3), however, the sharpness of (1.4) had not previously been shown. In this paper, we establish the sharpness of the inequality (1.4) using t copulas and min-stable trivariate exponential distributions. The sharpness of the bounds in (1.4) can be achieved in most cases through limiting t copulas, and this motivates us to study the extreme value t copulas (t-EV copulas), which preserve the tail dependence parameters of the t copula (see Theorem 6.8 of Joe, 1997). It should be mentioned that Demarta and McNeil (2005) derived the EV copula limit for the bivariate t copula using the standard techniques of extreme value theory. To ease the notational burden in higher dimensions, we introduce tail dependence and conditional tail dependence functions to derive the t-EV copulas. This toolbox of tail dependence functions, which is of interest in its own right, is particularly effective for the tail dependence analysis of multivariate t copulas. As two limiting cases, the t-EV copulas yield the Hüsler-Reiss distribution and the Marshall-Olkin distribution as the d.f. ν goes to infinity and to zero, respectively.

The remainder of the paper proceeds as follows. Section 2 introduces the multivariate tail dependence functions and derives the t-EV copula and its limiting cases. Section 3 establishes the sharpness of the inequality (1.4); also in Section 3, several distributions are compared with respect to the ranges of triplets of tail dependence parameters. Section 4 concludes with some discussion and further research.

2 Extreme Value Limits of Multivariate t Copulas

Consider the d-dimensional copula C of a random vector (U_1, …, U_d) with standard uniform margins and continuous second-order partial derivatives. Let I := {1, …, d} denote the index set. For any ∅ ≠ S ⊆ I, let C_S denote the copula of the |S|-dimensional margin (U_i, i ∈ S). Define the lower tail dependence functions b_S(·) of C, ∅ ≠ S ⊆ I, as follows:

  b_S(w_i, i ∈ S) := lim_{u↓0} C_S(u w_i, i ∈ S) / u,  ∀ w = (w_1, …, w_d) ∈ R_+^d.   (2.1)

Note that b_{{i}}(w_i) = w_i for the univariate margins, and that b_{ij}(1, 1) = λ^L_{ij} is the lower tail dependence parameter of the (i, j) margin. Analogous upper tail dependence functions could be defined, but for t copulas they are the same because of reflection symmetry. The following elementary properties of the functions b_S(·), ∅ ≠ S ⊆ I, are easily verified.

1. b_S(w_i, i ∈ S) = 0 if at least one w_i, i ∈ S, is zero.

2. b_S(w_i, i ∈ S) is non-decreasing in each w_i, i ∈ S.

3. For any fixed t > 0, b_S(t w_i, i ∈ S) = lim_{u↓0} C_S(u t w_i, i ∈ S)/u = t lim_{u↓0} C_S(u t w_i, i ∈ S)/(ut) = t b_S(w_i, i ∈ S); thus b_S(·) is homogeneous of order 1.

Since b(w) := b_I(w) is differentiable and homogeneous of order 1, Euler's homogeneous function theorem (see, e.g., Wilson, 1912) implies that

  b(w) = Σ_{j=1}^d (∂b/∂w_j) w_j,  ∀ w = (w_1, …, w_d) ∈ R_+^d,   (2.2)

where the partial derivatives ∂b/∂w_j are homogeneous of order 0 and bounded. For any 1 ≤ j ≤ d, let I_j := I − {j}. Observe from (2.1) that for any 1 ≤ j ≤ d,

  ∂b/∂w_j = lim_{u↓0} Pr{U_1 ≤ u w_1, …, U_{j−1} ≤ u w_{j−1}, U_{j+1} ≤ u w_{j+1}, …, U_d ≤ u w_d | U_j = u w_j}
          = lim_{u↓0} Pr{U_i ≤ u w_i, ∀ i ∈ I_j | U_j = u w_j} = lim_{u↓0} C_{I_j|{j}}(u w_i, i ∈ I_j | u w_j),   (2.3)

where the notation C_{S_1|S_2} refers to the conditional distribution of {U_i : i ∈ S_1} given {U_k : k ∈ S_2}. The partial derivatives ∂b/∂w_j are called the conditional lower tail dependence functions of the copula C. To express the EV copula of C in terms of b_S(·), we need the following function:

  a(w) := Σ_{S⊆I, S≠∅} (−1)^{|S|−1} b_S(w_i, i ∈ S).   (2.4)

It follows from (2.1) and the inclusion-exclusion relation that

  a(w) = lim_{u↓0} (1/u) Σ_{S⊆I, S≠∅} (−1)^{|S|−1} C_S(u w_i, i ∈ S)
       = lim_{u↓0} (1/u) Pr{U_i ≤ u w_i, for some i ∈ I}
       = lim_{u↓0} (1/u) ( 1 − Pr{U_i > u w_i, ∀ i ∈ I} )
       = lim_{u↓0} (1/u) ( 1 − Pr{1 − U_i ≤ 1 − u w_i, ∀ i ∈ I} ),

which implies the approximation

  Pr{1 − U_i ≤ 1 − u w_i, ∀ i ∈ I} ≈ 1 − u a(w),  u → 0.

Let C* denote the copula of (1 − U_1, …, 1 − U_d). The lower EV limit of C is the same as the upper EV limit of C*, which is given by lim_{n→∞} (C*)^n(u_1^{1/n}, …, u_d^{1/n}) (Chapter 6, Joe, 1997). For sufficiently large n, u_j^{1/n} = exp{n^{−1} log u_j} ≈ 1 + n^{−1} log u_j, so that with ũ_j = −log u_j,

  (C*)^n(u_1^{1/n}, …, u_d^{1/n}) ≈ ( 1 − n^{−1} a(ũ_1, …, ũ_d) )^n → exp{−a(ũ_1, …, ũ_d)},  as n → ∞.

Theorem 2.1. Let C^{EV} denote the lower EV copula of a copula C. Then C^{EV}(u_1, …, u_d) = exp{−a(−log u_1, …, −log u_d)}, where a(·) can be evaluated from (2.1)-(2.4).

Next we apply the above to the multivariate t copulas. Let X = (X_1, …, X_d) have the t distribution with ν d.f. and dispersion matrix Σ. Since increasing location-scale marginal transforms convert X to a t distribution with identical margins, we assume, for expositional simplicity, that the margins F_i = T_ν for all 1 ≤ i ≤ d, where T_ν is the t distribution function

with ν d.f., and that Σ = (ρ_{ij}) satisfies ρ_{ii} = 1 for all 1 ≤ i ≤ d. The conditional tail dependence function in (2.3) can be rephrased as

  ∂b/∂w_j = lim_{u↓0} Pr{X_i ≤ T_ν^{−1}(u w_i), ∀ i ∈ I_j | X_j = T_ν^{−1}(u w_j)},  1 ≤ j ≤ d.

Consider z = T_ν^{−1}(u w). Since the lower tail of the density T_ν′(z) behaves like α_ν ν |z|^{−(ν+1)}, where

  α_ν = ν^{(ν−1)/2} Γ((ν+1)/2) / ( Γ(ν/2) √(πν) ),

the distribution T_ν(z) behaves like α_ν |z|^{−ν} as z → −∞. Hence T_ν^{−1}(u) ≈ −(α_ν/u)^{1/ν} and z ≈ −(u w/α_ν)^{−1/ν} as u ↓ 0. Thus, the conditional tail dependence function of (2.3) for the t copula can be expressed as

  ∂b/∂w_j = lim_{u↓0} Pr{X_i ≤ −(u w_i/α_ν)^{−1/ν}, ∀ i ∈ I_j | X_j = −(u w_j/α_ν)^{−1/ν}},  1 ≤ j ≤ d.   (2.5)

To obtain the conditional distribution in (2.5) from the multivariate t distribution, we need the following result (see, e.g., Kotz and Nadarajah, 2004). Let T_{d,ν,Σ} denote the distribution function of a d-dimensional t distribution with d.f. ν and dispersion matrix Σ.

Lemma 2.2. Let X = (X_1, X_2) have the multivariate t distribution T_{d,ν,Σ} with

  Σ = ( Σ_{11}  Σ_{12} ; Σ_{21}  Σ_{22} ),   Σ_{22;1} = Σ_{22} − Σ_{21} Σ_{11}^{−1} Σ_{12},

where X_1 and X_2 are k- and (d−k)-dimensional, respectively. Then

  Pr{X_2 ≤ x_2 | X_1 = x_1} = T_{d−k, ν+k, Σ_{22;1}}( √( (ν+k) / (ν + x_1′ Σ_{11}^{−1} x_1) ) · x_{2;1} ),

where x_{2;1} = x_2 − Σ_{21} Σ_{11}^{−1} x_1.

Combining (2.5) and Lemma 2.2, we have

  ∂b/∂w_j = T_{d−1, ν+1, Σ_j}( √(ν+1) [ ρ_{ij} − (w_i/w_j)^{−1/ν} ], i ∈ I_j )
          = T_{d−1, ν+1, R_j}( (√(ν+1)/√(1−ρ_{ij}²)) [ ρ_{ij} − (w_i/w_j)^{−1/ν} ], i ∈ I_j ),   (2.6)

where Σ_j is the (d−1)×(d−1) dispersion matrix with entries ρ_{ik} − ρ_{ij} ρ_{kj}, i, k ∈ I_j (so its diagonal entries are 1 − ρ_{ij}²), and R_j is the corresponding correlation matrix

  R_j = (ρ_{ik;j})_{i,k∈I_j},   ρ_{ik;j} = (ρ_{ik} − ρ_{ij} ρ_{kj}) / ( √(1−ρ_{ij}²) √(1−ρ_{kj}²) ),  i ≠ j, k ≠ j,   (2.7)

whose off-diagonal entries ρ_{ik;j} are the partial correlations (and ρ_{ii;j} = 1). Substituting (2.6) in

(2.2) and (2.4), we obtain the tail dependence function and the EV limit of the t copula.

Theorem 2.3. Let C be the copula of a multivariate t distribution with ν d.f. and dispersion matrix Σ = (ρ_{ij}) with ρ_{ii} = 1 for 1 ≤ i ≤ d.

1. The tail dependence function of C is given by

  b(w) = Σ_{j=1}^d w_j T_{d−1, ν+1, R_j}( (√(ν+1)/√(1−ρ_{ij}²)) [ ρ_{ij} − (w_i/w_j)^{−1/ν} ], i ∈ I_j ),

for all w = (w_1, …, w_d) ∈ R_+^d.

2. The t-EV copula, obtained as the EV limit of the t copula, is given by

  C^{EV}(u_1, …, u_d) = exp{−a(w_1, …, w_d)},  w_j = −log u_j, j = 1, …, d,

with exponent

  a(w) = Σ_{j=1}^d w_j T_{d−1, ν+1, R_j}( (√(ν+1)/√(1−ρ_{ij}²)) [ (w_i/w_j)^{−1/ν} − ρ_{ij} ], i ∈ I_j ).   (2.8)

Proof. We only need to derive (2.8). It follows from (2.2), (2.4), and (2.6) that

  a(w) = Σ_{S⊆I, S≠∅} (−1)^{|S|−1} b_S(w_i, i ∈ S)
       = Σ_{S⊆I, S≠∅} (−1)^{|S|−1} ( Σ_{j∈S} w_j ∂b_S/∂w_j )
       = lim_{u↓0} Σ_{S⊆I, S≠∅} (−1)^{|S|−1} Σ_{j∈S} w_j Pr{ X_i ≤ −(u w_i/α_ν)^{−1/ν}, i ∈ S_j | X_j = −(u w_j/α_ν)^{−1/ν} }
       = lim_{u↓0} Σ_{j=1}^d w_j ( 1 − Σ_{S: j∈S, |S|≥2} (−1)^{|S|−2} Pr{ X_i ≤ −(u w_i/α_ν)^{−1/ν}, i ∈ S_j | X_j = −(u w_j/α_ν)^{−1/ν} } ),

where S_j = S − {j} ⊆ I_j. The inclusion-exclusion principle then implies that

  a(w) = Σ_{j=1}^d w_j T̄_{d−1, ν+1, R_j}( (√(ν+1)/√(1−ρ_{ij}²)) [ ρ_{ij} − (w_i/w_j)^{−1/ν} ], i ∈ I_j )
       = Σ_{j=1}^d w_j T_{d−1, ν+1, R_j}( (√(ν+1)/√(1−ρ_{ij}²)) [ (w_i/w_j)^{−1/ν} − ρ_{ij} ], i ∈ I_j ),

where T̄_{d−1,ν+1,R_j}(·) is the survival function of the distribution function T_{d−1,ν+1,R_j}, and the last equality follows from the radial symmetry of the t distribution. □



It is known that elliptical distribution functions are increasing in ρ_{ij} in the sense of the concordance ordering; see, e.g., Theorem 2.21 of Joe (1997). The tail dependence function b(·) of a t copula and its t-EV copula inherit this monotonicity property. It is also evident from Theorem 2.3 that b(1, 1, …, 1) (respectively a(1, 1, …, 1)) is decreasing (respectively increasing) in the d.f. ν. Note, however, that, as our numerical results show, for fixed w and ρ_{ij} the monotonicity of a(w) with respect to ν does not hold in general. Note also that the t-EV copula family given in Theorem 2.3 has positive dependence only, even though it is parametrized by a correlation matrix; the independence copula obtains only in the limit as ν → ∞.
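The bivariate case of Theorem 2.3 can be sanity-checked numerically. A sketch (an assumption on our part: for d = 2 the matrix R_j is empty, T_{1,ν+1} is the univariate t cdf, and we fix ν = 1 so that T_2 has a closed form):

```python
# Bivariate t-EV exponent a(w1, w2) from (2.8), with nu = 1 (T_2 closed form).
from math import sqrt

def t2_cdf(x):
    # univariate t cdf with 2 d.f.
    return 0.5 * (1.0 + x / sqrt(2.0 + x * x))

def tev_exponent(w1, w2, rho, nu=1.0):
    c = sqrt(nu + 1.0) / sqrt(1.0 - rho * rho)
    return (w1 * t2_cdf(c * ((w2 / w1) ** (-1.0 / nu) - rho))
            + w2 * t2_cdf(c * ((w1 / w2) ** (-1.0 / nu) - rho)))

rho = 0.5
lam = 2.0 * t2_cdf(-sqrt(2.0 * (1.0 - rho) / (1.0 + rho)))  # (1.2) with nu = 1

# a(1,1) = 2 - lambda: the t-EV copula preserves the tail dependence parameter
assert abs(tev_exponent(1.0, 1.0, rho) - (2.0 - lam)) < 1e-12
# a is homogeneous of order 1 and satisfies max(w1,w2) <= a <= w1 + w2
assert abs(tev_exponent(2.0, 6.0, rho) - 2.0 * tev_exponent(1.0, 3.0, rho)) < 1e-12
a = tev_exponent(0.7, 1.9, rho)
assert max(0.7, 1.9) <= a <= 0.7 + 1.9
```

The check a(1, 1) = 2 − λ is exactly the preservation of tail dependence under the EV limit mentioned in the Introduction (Theorem 6.8 of Joe, 1997).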

Next we discuss the limits of (2.8) as ν → 0 and ν → ∞. Let X = (X_1, …, X_d) have the multivariate standard normal distribution with correlation matrix R = (ρ_{ij}). Hüsler and Reiss (1989) derived the EV limit of X under the limiting condition

  ρ_{ij}(n) = 1 − 4 δ_{ij}^{−2} / log n → 1,  as n → ∞,   (2.9)

for positive parameters δ_{ij}, i ≠ j, with δ_{ij} = δ_{ji}, and some other constraints. The Hüsler-Reiss copula C^{HR} can be rephrased as a min-stable multivariate exponential distribution as follows (see page 183 of Joe, 1997): for any non-negative w_1, …, w_d,

  C^{HR}(e^{−w_1}, …, e^{−w_d}) = exp{−a(w; δ_{ij}, 1 ≤ i, j ≤ d)}.

The exponent function a(·) is given by

  a(w; δ_{ij}, 1 ≤ i, j ≤ d) = Σ_{j=1}^d w_j + Σ_{S⊆I, |S|≥2} (−1)^{|S|−1} r_S(w_i, i ∈ S; (δ_{ij})_{i,j∈S}),   (2.10)

where, for S = {i_1, …, i_s} ⊆ I with |S| = s ≥ 2,

  r_S(w_i, i ∈ S; (δ_{ij})_{i,j∈S}) = ∫_0^{w_{i_s}} Φ̄_{s−1, Γ_{S,i_s}}( δ_{i_j,i_s}^{−1} + (δ_{i_j,i_s}/2) log(y/w_{i_j}), 1 ≤ j ≤ s−1 ) dy,   (2.11)

and Φ̄_{s−1,Γ_{S,j}}(·) is the survival function of the multivariate normal distribution with correlation matrix Γ_{S,j} = (ϱ_{i,k;j}) whose (i, k) entry, i, k ∈ S − {j}, equals

  ϱ_{i,k;j} := ( δ_{ij}^{−2} + δ_{kj}^{−2} − δ_{ik}^{−2} ) / ( 2 δ_{ij}^{−1} δ_{kj}^{−1} ),   (2.12)

with δ_{ii}^{−1} defined as zero for all i. As our next result shows, the Hüsler-Reiss copula C^{HR} emerges as the limiting distribution of the t-EV copula as ν → ∞ with ρ_{ij} → 1 at appropriate rates.

Theorem 2.4. Let C_ν^{EV}(·) be the t-EV copula obtained in Theorem 2.3 with dispersion matrix Σ = (ρ_{ij}(ν)), where ρ_{ij}(ν) = 1 − 2 δ_{ij}^{−2}/ν and the parameters δ_{ij} are the same as those in (2.9). Then C_ν^{EV}(·) converges weakly to the Hüsler-Reiss copula C^{HR}(·) as ν → ∞.

Proof. For any ∅ ≠ S ⊆ I, let C_{S,ν}^{EV}(·) (respectively C_S^{HR}(·)) denote the |S|-dimensional marginal copula of C_ν^{EV}(·) (respectively C^{HR}(·)) with component indexes in S. Let b_{S,ν}(·), ∅ ≠ S ⊆ I, denote the tail dependence functions associated with the t-EV copula C_ν^{EV}(·) as defined in (2.1).

Observe that r_S(w_i, i ∈ S; (δ_{ij})_{i,j∈S}) in (2.11) is homogeneous of order 1. It is clear from (2.4) and (2.10) that we only need to show that r_S(·; (δ_{ij})_{i,j∈S}) of C^{HR}(·) is the limit of the corresponding tail dependence function of C_ν^{EV}(·); that is, for any ∅ ≠ S ⊆ I,

  r_S(w_i, i ∈ S; (δ_{ij})_{i,j∈S}) = lim_{ν→∞} b_{S,ν}(w_i, i ∈ S),  ∀ w = (w_1, …, w_d) ∈ R_+^d.   (2.13)

For this, let R_{S,j}, j ∈ S, denote the partial correlation matrix of C_{S,ν}^{EV}(·) as defined in (2.7), and let Γ_{S,j} denote the correlation matrix of C_S^{HR}(·) as defined in (2.12). It is easy to verify that for any 1 ≤ j ≤ d,

  ( ρ_{ik}(ν) − ρ_{ij}(ν) ρ_{kj}(ν) ) / ( √(1−ρ_{ij}²(ν)) √(1−ρ_{kj}²(ν)) ) → ( δ_{ij}^{−2} + δ_{kj}^{−2} − δ_{ik}^{−2} ) / ( 2 δ_{ij}^{−1} δ_{kj}^{−1} ) = ϱ_{i,k;j},  as ν → ∞.

Thus R_{S,j} converges entry-wise to Γ_{S,j}. Let S = {i_1, …, i_s} ⊆ I, |S| = s ≥ 2. From (2.6) and (2.11), we have

  ∂b_{S,ν}/∂w_{i_s} = T_{s−1, ν+1, R_{S,i_s}}( (√(ν+1)/√(1−ρ_{i_j i_s}²)) [ ρ_{i_j i_s} − (w_{i_j}/w_{i_s})^{−1/ν} ], 1 ≤ j ≤ s−1 )
                    = T̄_{s−1, ν+1, R_{S,i_s}}( (√(ν+1)/√(1−ρ_{i_j i_s}²)) [ (w_{i_j}/w_{i_s})^{−1/ν} − ρ_{i_j i_s} ], 1 ≤ j ≤ s−1 )

and

  ∂r_S(w_i, i ∈ S; (δ_{ij})_{i,j∈S})/∂w_{i_s} = Φ̄_{s−1, Γ_{S,i_s}}( δ_{i_j,i_s}^{−1} + (δ_{i_j,i_s}/2) log(w_{i_s}/w_{i_j}), 1 ≤ j ≤ s−1 ).

Observe that, as ν → ∞, with ρ_{i_j i_s}(ν) = 1 − 2 δ_{i_j,i_s}^{−2}/ν,

  (√(ν+1)/√(1−ρ_{i_j i_s}²(ν))) [ (w_{i_j}/w_{i_s})^{−1/ν} − ρ_{i_j i_s}(ν) ] ≈ (ν δ_{i_j,i_s}/2) [ ν^{−1} log(w_{i_s}/w_{i_j}) + 2 δ_{i_j,i_s}^{−2} ν^{−1} ] → δ_{i_j,i_s}^{−1} + (δ_{i_j,i_s}/2) log(w_{i_s}/w_{i_j}).

Therefore

  lim_{ν→∞} ∂b_{S,ν}/∂w_{i_s} = ∂r_S(w_i, i ∈ S; (δ_{ij})_{i,j∈S})/∂w_{i_s}.

Since ∂b_{S,ν}/∂w_{i_s} is bounded, the bounded convergence theorem implies that

  r_S(w_i, i ∈ S; (δ_{ij})_{i,j∈S}) = lim_{ν→∞} ∫_0^{w_{i_s}} (∂b_{S,ν}/∂x_{i_s}) dx_{i_s} = lim_{ν→∞} b_{S,ν}(w_i, i ∈ S),

and (2.13) holds. □


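The key asymptotic step of this proof can be checked numerically in the bivariate case. A sketch (an assumption: d = 2 with one parameter δ, so only the scalar argument of the cdf needs to be compared; the error should shrink like 1/ν):

```python
# Convergence of the t-EV argument in (2.8), under rho(nu) = 1 - 2/(delta^2 nu),
# to the Hüsler-Reiss argument 1/delta + (delta/2) log(wj/wi).
from math import sqrt, log

def tev_argument(wi, wj, nu, rho):
    # argument of T_{1,nu+1} in the j-th term of (2.8), bivariate case
    return sqrt(nu + 1.0) / sqrt(1.0 - rho * rho) * ((wi / wj) ** (-1.0 / nu) - rho)

def hr_argument(wi, wj, delta):
    # limiting Hüsler-Reiss argument (Remark 2.5 below, bivariate case)
    return 1.0 / delta + 0.5 * delta * log(wj / wi)

delta, wi, wj = 1.2, 1.0, 2.0
for nu in (1.0e4, 1.0e6):
    rho = 1.0 - 2.0 / (delta * delta * nu)
    err = abs(tev_argument(wi, wj, nu, rho) - hr_argument(wi, wj, delta))
    assert err < 100.0 / nu   # O(1/nu) convergence, with a generous constant
```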

Remark 2.5. Note that our method of tail dependence functions also yields a more explicit expression for the Hüsler-Reiss copula. It follows from Theorems 2.3 and 2.4 that the exponent function of the Hüsler-Reiss copula can be expressed as

  a(w; δ_{ij}, 1 ≤ i, j ≤ d) = Σ_{j=1}^d w_j Φ_{d−1, Γ_{I,j}}( δ_{ij}^{−1} + (δ_{ij}/2) log(w_j/w_i), i ∈ I_j ).

This homogeneous representation provides better insight into the structure of the Hüsler-Reiss copula: the rates of change of its exponent function are driven by normal distributions. Hüsler and Reiss (1989) obtained this expression in the bivariate case:

  a(w_1, w_2) = w_1 Φ( δ^{−1} + (δ/2) log(w_1/w_2) ) + w_2 Φ( δ^{−1} + (δ/2) log(w_2/w_1) ).

Their method, however, seems too cumbersome to extend to higher dimensions in the form of (2.2). Our form is easier to use in higher dimensions: only a routine for the multivariate normal cumulative distribution function is needed, and no additional integral.

As ν → 0, the rates of change of the exponent function (2.8) of the t copula depend on the arguments w_i, 1 ≤ i ≤ d, only through their ranks, and thus singularities appear. Let F_{d,Σ} denote the distribution function of a d-dimensional Cauchy distribution with dispersion matrix Σ. Since (w_j/w_i)^{1/ν} goes to zero or to infinity as ν → 0, depending on whether w_j < w_i or w_j > w_i, the exponent (2.8) as ν → 0 becomes

  a_0(w) = Σ_{j=1}^d w_j F_{d−r(w_j), R_{S_j(w);j}}( −ρ_{ij}/√(1−ρ_{ij}²), i ∈ S_j(w) ),   (2.14)

where F_0 := 1, r(w_j) denotes the rank of w_j among w_1, …, w_d, S_j(w) = {i : w_i > w_j}, and R_{S_j(w);j} is the sub-matrix of (2.7) obtained by retaining the columns and rows with indexes in S_j(w). Observe that the exponent a_0(w) is a linear function of w_i, 1 ≤ i ≤ d, with coefficients depending on the ranks, and thus, as the next example shows, it resembles the exponent functions of Marshall-Olkin distributions; see Marshall and Olkin (1967a,b).

Example 2.6. Consider the trivariate case of a_0(·). Let

  α_{ik;j}(ρ_{ij}, ρ_{kj}) = F_{2, R_{S_j(w);j}}( −ρ_{ij}/√(1−ρ_{ij}²), −ρ_{kj}/√(1−ρ_{kj}²) ),  if S_j(w) = {i, k},
  α_{i;j}(ρ_{ij}) = F_1( −ρ_{ij}/√(1−ρ_{ij}²) ),  if S_j(w) = {i}.

For w_1 < w_2 < w_3,

  a_0(w) = α_{23;1}(ρ_{21}, ρ_{31}) w_1 + α_{3;2}(ρ_{32}) w_2 + w_3.   (2.15)

In contrast, the survival function of a trivariate Marshall-Olkin distribution with non-negative parameters {β_S : ∅ ≠ S ⊆ {1, 2, 3}} can be written as

  Ḡ^{MO}(w_1, w_2, w_3) = Pr{W_1 > w_1, W_2 > w_2, W_3 > w_3} = exp{−a^{MO}(w_1, w_2, w_3)},

where the exponent function is given by

  a^{MO}(w_1, w_2, w_3) = Σ_{j=1}^3 β_j w_j + Σ_{1≤i<k≤3} β_{ik} (w_i ∨ w_k) + β_{123} (w_1 ∨ w_2 ∨ w_3).   (2.16)

More generally, a d-variate Marshall-Olkin distribution has survival function Ḡ^{MO}(w) = Pr{W_1 > w_1, …, W_d > w_d} = exp{−a^{MO}(w)}, where

  a^{MO}(w) = Σ_{S: ∅≠S⊆I} β_S ( ∨_{i∈S} w_i ),   (2.17)

for some non-negative parameters β_S.

Theorem 2.7. The multivariate t-EV copula with exponent (2.14) converges weakly, as ν → 0, to a Marshall-Olkin distribution G^{MO} with exponent of the form (2.17).

Proof. We first introduce the notation

  α_{S_j(w);j}(ρ_{ij}, i ∈ S_j(w)) := F_{d−r(w_j), R_{S_j(w);j}}( −ρ_{ij}/√(1−ρ_{ij}²), i ∈ S_j(w) ),

with S_j(w) = {i : w_i > w_j}. For any w_1 < w_2 < · · · < w_d, we have

  a_0(w) = α_{2…d;1}(ρ_{i1}, 2 ≤ i ≤ d) w_1 + α_{3…d;2}(ρ_{i2}, 3 ≤ i ≤ d) w_2 + · · · + w_d,
  a^{MO}(w) = β_1 w_1 + (β_2 + β_{12}) w_2 + · · · + ( Σ_{S: d∈S} β_S ) w_d.

Thus, we need to find β_S such that

  β_1 = α_{2…d;1}(ρ_{i1}, 2 ≤ i ≤ d),  β_2 + β_{12} = α_{3…d;2}(ρ_{i2}, 3 ≤ i ≤ d),  …,  Σ_{S: d∈S} β_S = 1.

By exchanging any two indexes, we obtain a system of equations for the β_S:

  Σ_{S: j∈S, S⊆S_j^c(w)} β_S = α_{S_j(w);j}(ρ_{ij}, i ∈ S_j(w)),  1 ≤ j ≤ d,   (2.18)

where S_j^c(w) denotes the complement of S_j(w). To construct non-negative solutions of (2.18), let (Z_{1j}, …, Z_{dj}) be a random vector jointly having the multivariate Cauchy distribution with correlation matrix (2.7), and let γ_{ij} := −ρ_{ij}/√(1−ρ_{ij}²), 1 ≤ i, j ≤ d, i ≠ j. Then

  α_{S_j(w);j}(ρ_{ij}, i ∈ S_j(w)) = Pr{Z_{ij} ≤ γ_{ij}, i ∈ S_j(w)}.

Partition the set A_j(w) := {(z_1, …, z_d) : z_i ≤ γ_{ij}, i ∈ S_j(w)} into the following non-overlapping subsets: for any S with j ∈ S and S ⊆ S_j^c(w),

  A_{jS}(w) := {(z_1, …, z_d) : z_i ≤ γ_{ij}, i ∈ S_j(w); z_k ≤ γ_{kj}, k ∈ S_j^c(w) − S; z_l > γ_{lj}, l ∈ S − {j}}.

It is easy to see that A_j(w) = ∪_{S: j∈S, S⊆S_j^c(w)} A_{jS}(w) and that the A_{jS}(w) are mutually exclusive. Therefore,

  Σ_{S: j∈S, S⊆S_j^c(w)} Pr{(Z_{1j}, …, Z_{dj}) ∈ A_{jS}(w)} = Pr{(Z_{1j}, …, Z_{dj}) ∈ A_j(w)} = α_{S_j(w);j}(ρ_{ij}, i ∈ S_j(w)).

Let, for any S such that j ∈ S and S ⊆ S_j^c(w) (1 ≤ j ≤ d),

  β_S = Pr{(Z_{1j}, …, Z_{dj}) ∈ A_{jS}(w)}.   (2.19)

These non-negative β_S satisfy (2.18). Hence the exponent a_0(·) of the limiting t-EV copula as ν → 0 can be written as the exponent a^{MO}(·) of a Marshall-Olkin distribution with parameters (2.19). □

Remark 2.8. The explicit construction (2.19) can be used to evaluate the β_S for the Marshall-Olkin representation of the limiting t-EV copula. Note that each β_S has |S| different, but equivalent, expressions. For example, β_{12} in Example 2.6 has the following two expressions:

  β_{12} = α_{3;2}(ρ_{32}) − α_{13;2}(ρ_{12}, ρ_{32}) = α_{3;1}(ρ_{31}) − α_{23;1}(ρ_{21}, ρ_{31}).

The equivalence of these two expressions can be translated into the following structural symmetry for the trivariate standard Cauchy distribution with correlation matrix R:

  F_1( −ρ_{32}/√(1−ρ_{32}²) ) + F_{2, R_{{2,3};1}}( −ρ_{21}/√(1−ρ_{21}²), −ρ_{31}/√(1−ρ_{31}²) )
  = F_1( −ρ_{31}/√(1−ρ_{31}²) ) + F_{2, R_{{1,3};2}}( −ρ_{12}/√(1−ρ_{12}²), −ρ_{32}/√(1−ρ_{32}²) ),

where F_1(·) denotes the univariate margin. The other symmetry relations can be obtained similarly. Our numerical results show that multivariate t distributions with d.f. ν ≠ 1 do not possess such structural symmetries.

The limiting t-EV copula as ν → 0 takes an especially simple form in the bivariate case. Let Y_1, Y_2 and Y_{12} be independent exponential random variables with E(Y_i) = α^{−1}, i = 1, 2, and E(Y_{12}) = (1 − α)^{−1}, where α = F_1( −ρ/√(1−ρ²) ). Then the limiting distribution of the t-EV copula C^{EV}(u_1, u_2) as ν → 0 is described by the joint distribution of W_1 = min{Y_1, Y_{12}}, W_2 = min{Y_2, Y_{12}}; that is,

  lim_{ν→0} C^{EV}(e^{−w_1}, e^{−w_2}) = exp{−α(w_1 + w_2)} exp{−(1 − α)(w_1 ∨ w_2)},

for any non-negative w_1, w_2.
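This bivariate ν → 0 limit can be sketched in a few lines; for the Cauchy margin, α = F_1(−ρ/√(1−ρ²)) simplifies to arccos(ρ)/π, so the limiting tail dependence 2 − a_0(1,1) = 1 − α agrees with the ν → 0 value of (1.2) used in Section 3 (a consistency check, not an implementation from the paper):

```python
# Bivariate Marshall-Olkin limit of the t-EV copula as nu -> 0.
from math import atan, acos, pi, sqrt

def cauchy_cdf(x):
    return atan(x) / pi + 0.5

def mo_exponent_nu0(w1, w2, rho):
    alpha = cauchy_cdf(-rho / sqrt(1.0 - rho * rho))  # = arccos(rho)/pi
    return alpha * (w1 + w2) + (1.0 - alpha) * max(w1, w2)

rho = 0.6
# tail dependence of the limit: 2 - a0(1,1) = 1 - alpha = 1 - arccos(rho)/pi
lam = 2.0 - mo_exponent_nu0(1.0, 1.0, rho)
assert abs(lam - (1.0 - acos(rho) / pi)) < 1e-12
# the exponent is homogeneous of order 1, as any EV exponent must be
assert abs(mo_exponent_nu0(2.0, 4.0, 0.3) - 2.0 * mo_exponent_nu0(1.0, 2.0, 0.3)) < 1e-12
```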

3 Sharpness of Compatibility Inequalities for Tail Dependence

Let λ_{ij} := b_{ij}(1, 1) denote the lower tail dependence parameter of the (i, j) margin C_{ij} of a d-dimensional copula C. To compare the range of tail dependence of t copulas (and t-EV copulas) with the bounds in (1.4), we utilize the correlation inequality (1.3). It follows from (1.2) that for a bivariate t copula with ν d.f., correlation ρ and tail dependence parameter λ, we have

  ρ = (ν + 1 − α) / (ν + 1 + α),   α = [ T_{ν+1}^{−1}(λ/2) ]².   (3.1)

Thus, the lower and upper bounds of ρ_{ik}, as functions of λ_{ij} and λ_{jk}, are given by B^− and B^+ respectively, where

  B^± = { (ν+1 − [T_{ν+1}^{−1}(λ_{ij}/2)]²)(ν+1 − [T_{ν+1}^{−1}(λ_{jk}/2)]²) ± 4(ν+1) T_{ν+1}^{−1}(λ_{ij}/2) T_{ν+1}^{−1}(λ_{jk}/2) } / { (ν+1 + [T_{ν+1}^{−1}(λ_{ij}/2)]²)(ν+1 + [T_{ν+1}^{−1}(λ_{jk}/2)]²) }.

Since the tail dependence parameter of the t copula is an increasing function of its correlation, the lower and upper bounds of λ_{ik} are given by

  2 T_{ν+1}( −√( (ν+1)(1−B^−) / (1+B^−) ) )  and  2 T_{ν+1}( −√( (ν+1)(1−B^+) / (1+B^+) ) ),

respectively.

As a limiting case of the t-EV copula when ν → ∞, the trivariate Hüsler-Reiss copula has parameters δ_{ij}, δ_{ik}, δ_{jk} satisfying −1 ≤ ϱ_{ik;j} ≤ 1, where ϱ_{ik;j} = (δ_{ij}^{−2} + δ_{jk}^{−2} − δ_{ik}^{−2})/(2 δ_{ij}^{−1} δ_{jk}^{−1}) (see (2.12)). Thus the bounds for δ_{ik}^{−1} are |δ_{ij}^{−1} ± δ_{jk}^{−1}|. Since λ_{ik} = 2 − 2Φ(δ_{ik}^{−1}), the bounds for λ_{ik} are given by 2 − 2Φ( |Φ^{−1}(1 − λ_{ij}/2) ± Φ^{−1}(1 − λ_{jk}/2)| ), where Φ(·) is the standard normal distribution function. One can see that the upper bounds for both the t and the Hüsler-Reiss copulas are equal to the upper bound of inequality (1.4) when λ_{ij} = λ_{jk}. Table 1 contains representative results for the bounds of λ_{ik} from both copulas, together with the bounds of inequality (1.4), for various values of λ_{ij}, λ_{jk} and ν. From the table one can clearly see that the bounds for the t copula are closer than those for the Hüsler-Reiss copula to the bounds of inequality (1.4). In fact, as ν → 0 they become identical, except in the case λ_{ij} + λ_{jk} < 1, where λ_{ikL} stays greater than zero. Note here that (1.2) and (3.1) are still well defined for ν = 0.
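The bound computations above are straightforward to reproduce. A sketch (an assumption on our part: we fix ν = 1, since T_2 and its inverse have closed forms and the stdlib provides the normal cdf/quantile; the paper's Table 1 uses ν = 5, 2 and ν → 0 instead):

```python
# Bounds of lambda_ik for the t copula (via (3.1), (1.3), (1.2), with nu = 1)
# and for the Hüsler-Reiss limit, checked against inequality (1.4).
from math import sqrt
from statistics import NormalDist

_nd = NormalDist()

def t2_cdf(x):
    return 0.5 * (1.0 + x / sqrt(2.0 + x * x))

def t2_ppf(p):
    s = 2.0 * p - 1.0
    return s * sqrt(2.0) / sqrt(1.0 - s * s)

def t_lambda_bounds(lam_ij, lam_jk, nu=1.0):
    A = nu + 1.0
    q_ij, q_jk = t2_ppf(lam_ij / 2.0), t2_ppf(lam_jk / 2.0)  # both negative
    num = (A - q_ij ** 2) * (A - q_jk ** 2)
    den = (A + q_ij ** 2) * (A + q_jk ** 2)
    cross = 4.0 * A * q_ij * q_jk
    out = []
    for B in ((num - cross) / den, (num + cross) / den):  # B-, B+
        B = min(B, 1.0)  # guard tiny rounding above 1 when lam_ij == lam_jk
        out.append(2.0 * t2_cdf(-sqrt(A * (1.0 - B) / (1.0 + B))))
    return tuple(out)  # (lower, upper)

def hr_lambda_bounds(lam_ij, lam_jk):
    a, b = _nd.inv_cdf(1.0 - lam_ij / 2.0), _nd.inv_cdf(1.0 - lam_jk / 2.0)
    return 2.0 - 2.0 * _nd.cdf(a + b), 2.0 - 2.0 * _nd.cdf(abs(a - b))

for lam_ij, lam_jk in [(0.1, 0.9), (0.2, 0.6), (0.4, 0.8)]:
    lo14, hi14 = max(0.0, lam_ij + lam_jk - 1.0), 1.0 - abs(lam_ij - lam_jk)
    for lo, hi in (t_lambda_bounds(lam_ij, lam_jk), hr_lambda_bounds(lam_ij, lam_jk)):
        assert lo14 - 1e-9 <= lo <= hi <= hi14 + 1e-9  # inside inequality (1.4)

# with lam_ij = lam_jk, both upper bounds reach the (1.4) upper bound, i.e. 1
assert abs(t_lambda_bounds(0.3, 0.3)[1] - 1.0) < 1e-9
assert abs(hr_lambda_bounds(0.3, 0.3)[1] - 1.0) < 1e-9
```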

  λij   λjk  | Hüsler-Reiss  | t, ν = 5      | t, ν = 2      | t, ν → 0      | Bounds in (1.4)
             | λikL    λikU  | λikL    λikU  | λikL    λikU  | λikL    λikU  | λikL    λikU
  0.1   0.9  | 0.077   0.129 | 0.073   0.133 | 0.068   0.139 | 0       0.2   | 0       0.2
  0.2   0.8  | 0.125   0.304 | 0.119   0.312 | 0.109   0.322 | 0       0.4   | 0       0.4
  0.1   0.4  | 0.013   0.422 | 0.007   0.453 | 0.001   0.493 | 0.5     0.7   | 0       0.7
  0.2   0.6  | 0.071   0.449 | 0.061   0.464 | 0.047   0.483 | 0.2     0.6   | 0       0.6
  0.3   0.9  | 0.245   0.362 | 0.242   0.365 | 0.238   0.370 | 0.2     0.4   | 0.2     0.4
  0.4   0.8  | 0.274   0.556 | 0.268   0.560 | 0.262   0.566 | 0.2     0.6   | 0.2     0.6

Table 1: Bounds (λikL, λikU) of the tail dependence parameter λik given λij and λjk, for the Hüsler-Reiss and t copulas with various d.f. ν, along with the bounds obtained from inequality (1.4).

Consider the limit of tail dependence as ν → 0. It follows from the Cauchy distribution T_1(x) = π^{−1} arctan(x) + 0.5 that (1.2) becomes

  λ = 1 + 2π^{−1} arctan( −√((1−ρ)/(1+ρ)) ) = 1 − 2π^{−1} arccos( √((1+ρ)/2) ) = 1 − π^{−1} arccos(ρ),

and thus ρ = 2[ cos(π(1−λ)/2) ]² − 1 = cos(π(1−λ)).

Given λ_{ij}, λ_{jk} in (0, 1), we then have ρ_{ij} = cos(π(1−λ_{ij})) and ρ_{jk} = cos(π(1−λ_{jk})), implying that

  ρ_{ikU} = ρ_{ij} ρ_{jk} + √(1−ρ_{ij}²) √(1−ρ_{jk}²) = cos(π(λ_{jk} − λ_{ij})).

Then the maximum tail dependence possible for λ_{ik} is

  λ_{ikU} = 1 − π^{−1} arccos(ρ_{ikU}) = 1 − |λ_{jk} − λ_{ij}|.

Similarly, ρ_{ikL} = ρ_{ij} ρ_{jk} − √(1−ρ_{ij}²) √(1−ρ_{jk}²) = cos(π(λ_{ij} + λ_{jk})), and the minimum tail dependence possible for λ_{ik} is

  λ_{ikL} = 1 − π^{−1} arccos(ρ_{ikL}) = { 1 − λ_{ij} − λ_{jk},  if λ_{ij} + λ_{jk} < 1;
                                           1 − (2 − λ_{ij} − λ_{jk}) = λ_{ij} + λ_{jk} − 1,  if λ_{ij} + λ_{jk} ≥ 1. }

The latter case comes from π(λ_{ij} + λ_{jk}) ≥ π. Therefore, the upper bound of (1.4) is always reached as ν → 0, and the lower bound of (1.4) can be reached as ν → 0 if λ_{ij} + λ_{jk} ≥ 1. In the remaining case of λ_{ij} + λ_{jk} < 1,

  λ_{ikL}(ν) = 2 T_{ν+1}( −√( (ν+1)(1−ρ_{ikL}) / (1+ρ_{ikL}) ) )   (3.2)

is not decreasing as ν → 0; see Figure 1. Also note that for (1.2), ρ > 0 implies λ > 2T_{ν+1}(−√(ν+1)), which approaches 2T_1(−1) = 0.5 as ν → 0; for larger ν, ρ > 0 implies only λ > ε(ν), where ε(ν) = 2T_{ν+1}(−√(ν+1)) is near 0. For ν → 0, if λ_{ij}, λ_{jk} are small, then ρ_{ij}, ρ_{jk} are negative, but ρ_{ikL} can be quite positive. Hence 1 − λ_{ij} − λ_{jk} is not decreasing as λ_{ij}, λ_{jk} get smaller. However, Figure 1 shows that λ_{ikL}(ν) in (3.2) comes close to zero for some ν > 0.

In order to show the sharpness of inequality (1.4) in the case λ_{ij} + λ_{jk} < 1, we use the Marshall-Olkin distribution or, more generally, certain multivariate min-stable exponential distributions.

Example 3.1. Consider a trivariate Marshall-Olkin distribution with exponent (2.17) and the following parameters: β_1 = 3/4, β_2 = 1/2, β_3 = 1/4, β_{12} = 0, β_{13} = 1/4, β_{23} = 1/2, β_{123} = 0. It is easy to verify that all the univariate margins are standard exponential. Using the formulas for tail dependence parameters of Marshall-Olkin distributions (see, e.g., Li, 2007), we have λ_{12} = 0, λ_{13} = 1/4, λ_{23} = 1/2. Since λ_{13} + λ_{23} < 1, the lower bound of (1.4) for λ_{12} is achieved. Notice that this Marshall-Olkin distribution cannot be used to represent the limit of the t-EV copula as d.f. ν → 0, because β_1 > β_2 + β_{12}, which violates the parameter constraints imposed by (2.18) (see also Example 2.6). □

Example 3.2. Let Z_i, Z_j, Z_k be independent standard exponential random variables, and let α_i, α_j, β_i, β_j, β_k be positive constants such that α_i + α_j = 1 and β_i + β_j + β_k = 1. Let

  X_i = min{Z_i/α_i, Z_j/α_j},   X_j = min{Z_i/β_i, Z_j/β_j, Z_k/β_k},   X_k = Z_k.

Figure 1: Lower bound (λikL) of the tail dependence parameter λik for the t copula as a function of ν, for various values of λ = λij = λjk with λij + λjk < 1 (curves for λ = 0.1, 0.2, 0.3, 0.4 over ν ∈ [0, 5]). The minimum of each curve is positive but very close to 0.

Then (X_i, X_j, X_k) jointly has a multivariate min-stable exponential distribution with standard exponential margins. Also, X_i and X_k are pairwise independent, but X_i, X_j and X_j, X_k are pairwise dependent. Hence λ_{ij} > 0, λ_{jk} > 0 and λ_{ik} = 0. The bivariate survival functions are given by

  Ḡ_{ij}(x_i, x_j) = exp{−( α_i x_i ∨ β_i x_j + α_j x_i ∨ β_j x_j + β_k x_j )} =: exp{−a_{ij}(x_i, x_j)},
  Ḡ_{jk}(x_j, x_k) = exp{−( β_i x_j + β_j x_j + β_k x_j ∨ x_k )} =: exp{−a_{jk}(x_j, x_k)},

so that

  λ_{ij} = 2 − a_{ij}(1, 1) = 2 − α_i ∨ β_i − α_j ∨ β_j − β_k,   (3.3)
  λ_{jk} = 2 − a_{jk}(1, 1) = 2 − β_i − β_j − 1 = β_k.   (3.4)

Assume, without loss of generality, that α_i < β_i and α_j > β_j. From (3.3) and (3.4) we have

  λ_{ij} = 1 + β_j − α_j  ⇔  α_j = β_j + 1 − λ_{ij},
  λ_{jk} = β_k  ⇔  β_i = 1 − λ_{jk} − β_j.   (3.5)

Thus we obtain the family of solutions in (3.5) if

  0 < α_j < 1,  0 < β_i < 1,  0 < β_j < 1  ⇔  β_j ∈ ( 0, min{λ_{ij}, 1 − λ_{jk}} ).

If we select β_j ∈ ( 0, min{λ_{ij}, 1 − λ_{jk}} ), then we have a family of distributions with λ_{ij} + λ_{jk} < 1 but λ_{ik} = 0. The lower bound of (1.4) is achieved for this family of distributions. □
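The algebra in Examples 3.1 and 3.2 can be verified directly from the exponent functions, using λ_rs = 2 − a_rs(1, 1) for the bivariate marginal exponents (a sketch; the dictionary representation of the β_S is our own bookkeeping, not the paper's):

```python
# Example 3.1: Marshall-Olkin parameters and their tail dependence parameters.
beta = {frozenset({1}): 0.75, frozenset({2}): 0.5, frozenset({3}): 0.25,
        frozenset({1, 2}): 0.0, frozenset({1, 3}): 0.25, frozenset({2, 3}): 0.5,
        frozenset({1, 2, 3}): 0.0}

def marginal_rate(i):
    # the margin of W_i is exponential with rate sum_{S: i in S} beta_S
    return sum(b for S, b in beta.items() if i in S)

def bivariate_exponent_at_11(i, j):
    # a_ij(1,1) = sum of beta_S over all S meeting {i, j}
    return sum(b for S, b in beta.items() if S & {i, j})

assert all(abs(marginal_rate(i) - 1.0) < 1e-12 for i in (1, 2, 3))  # standard exp margins
lam = {(i, j): 2.0 - bivariate_exponent_at_11(i, j) for i, j in [(1, 2), (1, 3), (2, 3)]}
assert abs(lam[(1, 2)] - 0.0) < 1e-12   # lower bound of (1.4) attained
assert abs(lam[(1, 3)] - 0.25) < 1e-12
assert abs(lam[(2, 3)] - 0.5) < 1e-12

# Example 3.2: target (lam_ij, lam_jk) with lam_ij + lam_jk < 1; pick
# beta_j in (0, min(lam_ij, 1 - lam_jk)) and set the rest via (3.5).
lam_ij_t, lam_jk_t, beta_j = 0.3, 0.4, 0.1
alpha_j = beta_j + 1.0 - lam_ij_t; alpha_i = 1.0 - alpha_j
beta_k = lam_jk_t; beta_i = 1.0 - lam_jk_t - beta_j
a_ij = max(alpha_i, beta_i) + max(alpha_j, beta_j) + beta_k   # (3.3) at (1,1)
a_jk = beta_i + beta_j + 1.0                                  # (3.4) at (1,1)
a_ik = (alpha_i + alpha_j) + 1.0   # X_i and X_k share no Z, so independent
assert abs((2.0 - a_ij) - lam_ij_t) < 1e-12
assert abs((2.0 - a_jk) - lam_jk_t) < 1e-12
assert abs(2.0 - a_ik) < 1e-12     # lambda_ik = 0, the lower bound of (1.4)
```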

4 Concluding Remarks

In this paper the sharpness of the inequality among the three tail dependence parameters of a trivariate margin is proved. The key is the t copula as ν → 0, which shows that trivariate t copulas cover a wide range of tail dependence. This motivated the derivation of the limiting extreme value distribution for the d-variate t copula with ν d.f.; this limit involves the (d−1)-dimensional t distribution with ν+1 d.f. Software exists for the numerical computation of the multivariate t distribution function, for example, Genz and Bretz (2002) and the mvtnorm package in R. A conjecture is that for d ≥ 4, the set of tail dependence parameters {λ_{jk} : 1 ≤ j < k ≤ d} for the t and related copulas might reach the full range. Sharpness results in dimensions d ≥ 4, for sets of six or more bivariate tail dependence parameters, would be hard to establish; but based on the sharpness for d = 3 (that is, for sets of three tail dependence parameters), the range of {λ_{jk} : 1 ≤ j < k ≤ d} for d-variate t copulas provides a benchmark against which one can check whether another copula family has a wide and flexible range of tail dependence.

References

Demarta, S. and McNeil, A. J. (2005). The t copula and related copulas. International Statistical Review, 73:111-129.

Embrechts, P., McNeil, A. J., and Straumann, D. (2002). Correlation and dependency in risk management: properties and pitfalls. In Risk Management: Value at Risk and Beyond (ed. M. Dempster), pp. 176-223. Cambridge University Press, Cambridge, UK.

Genz, A. and Bretz, F. (2002). Methods for the computation of multivariate t-probabilities. Journal of Computational and Graphical Statistics, 11:950-971.

Hüsler, J. and Reiss, R.-D. (1989). Maxima of normal random vectors: between independence and complete dependence. Statistics & Probability Letters, 7(4):283-286.

Joe, H. (1997). Multivariate Models and Dependence Concepts. Chapman & Hall, London.

Kotz, S. and Nadarajah, S. (2004). Multivariate t Distributions and Their Applications. Cambridge University Press, Cambridge, UK.

Kurowicka, D. and Cooke, R. (2006). Uncertainty Analysis with High Dimensional Dependence Modelling. Wiley Series in Probability and Statistics.

Li, H. (2007). Tail dependence comparison of survival Marshall-Olkin copulas. To appear in Methodology and Computing in Applied Probability.

Marshall, A. and Olkin, I. (1967a). A generalized bivariate exponential distribution. Journal of Applied Probability, 4:291-302.

Marshall, A. and Olkin, I. (1967b). A multivariate exponential distribution. Journal of the American Statistical Association, 62:30-44.

McNeil, A. J., Frey, R., and Embrechts, P. (2005). Quantitative Risk Management: Concepts, Techniques, and Tools. Princeton University Press, Princeton, New Jersey.

Wilson, E. B. (1912). Advanced Calculus. Ginn and Company, Boston.
