Annals of the Institute of Statistical Mathematics manuscript No. (will be inserted by the editor)

Influence diagnostics for robust P-splines using scale mixture of normal distributions

Felipe Osorio

Received: date / Accepted: date

Abstract It has been well documented that the presence of outliers and/or extreme data can strongly affect smoothing via splines. This work proposes an alternative for accommodating outliers in penalized splines, considering maximum penalized likelihood estimation under the class of scale mixtures of normal distributions. This family of distributions has proved to be an interesting alternative for producing robust estimates, while keeping the elegance and simplicity of maximum likelihood theory. The aim of this paper is to apply a variant of the EM algorithm for efficiently computing the penalized maximum likelihood estimates in the context of penalized splines. In order to highlight some aspects of the robustness of the proposed penalized estimators, we consider the assessment of influential observations through case deletion and local influence methods. Numerical experiments were carried out to illustrate the good performance of the proposed technique.

Keywords Cook distance · Local influence · Penalized EM algorithm · Scale mixtures of normal distributions

1 Introduction

Regression methods using splines are very attractive because they represent a flexible approach to fitting curves and are often used to find the underlying tendencies in the data. Discussions about smoothing and nonparametric regression can be found, for example, in Silverman (1985), Eilers and Marx (1996) and Ruppert et al. (2003).

The online version of this article contains supplementary material.

Felipe Osorio
Departamento de Matemática, Universidad Técnica Federico Santa María, Av. España 1680, Valparaíso, Chile
E-mail: [email protected]


The development of robust methodologies with the objective of attenuating the effect of outliers and/or influential observations in semiparametric regression has received considerable attention since the seminal works of Huber (1979) and Utreras (1981), who introduced robust smoothing based on M-estimators. Their ideas have since been refined and applied to more general contexts. For example, Koenker et al. (1994) proposed quantile smoothing splines, Oh et al. (2004) proposed M-estimation for smoothing periodic functions, while Lee and Oh (2007) and Oh et al. (2008) developed robust M-type estimation procedures with applications to additive mixed models and local polynomial regression, respectively. More recently, Tharmaratnam et al. (2010) and Mateos and Giannakis (2012) have discussed very efficient methods for computing penalized S- and M-type regression spline estimators, respectively.

The appropriate selection of the smoothing parameter is crucial in the class of penalized spline regression models. It is important to note that this selection can also be strongly affected by the presence of outlying observations. To avoid this type of difficulty, Cantoni and Ronchetti (2001), Wei (2005) and Lee and Cox (2009) have focused on developing methods for the robust selection of the smoothing parameter, while Staudenmayer et al. (2009) and Ibacache-Pulgar and Paula (2011) have described approaches for accommodating outliers in semiparametric regression considering Student-t errors.

With the objective of evaluating the model assumptions and determining whether outlying or extreme observations can influence the parameter estimates, diagnostic procedures have been developed in the context of semiparametric regression (Eubank 1985). For example, Eubank (1984) studied the properties of the prediction matrix for smoothing splines, while Silverman (1985) discussed definitions for the residuals. Eubank and Gunst (1986) proposed measures to evaluate influence in penalized least squares that are useful in contexts like smoothing splines and ridge regression (Hoerl and Kennard 1970). Studies of influence in the context of ridge regression suggest that this class of penalized estimators can be very sensitive to extreme observations (Walker and Birch 1988; Billor and Loynes 1999; Shi and Wang 1999). Thomas (1991) studied influence to evaluate the impact of extreme observations on the selection of the smoothing parameter considering the local influence procedure proposed by Cook (1986). Manchester (1996) proposed a graphical tool to evaluate the sensitivity of some robust smoothing methods through the use of the influence function. Kim (1996) and Wei (2004) developed diagnostic measures in smoothing splines based on case deletion procedures. Kim et al. (2002) and Ibacache-Pulgar and Paula (2011) discussed influence diagnostics using the elimination of observations and local influence, respectively, in partially linear models.

This work proposes an alternative for accommodating outliers in penalized splines, also known as P-splines (see Eilers and Marx 1996), considering distributions with heavier tails than the normal. Specifically, we consider the class of scale mixtures of normal distributions (SMN), which includes as particular cases the exponential power, contaminated normal, slash and Student-t distributions, among others (Andrews and Mallows 1974). SMN distributions


have often been proposed for developing robust inference in various statistical models. This class of distributions inherits many of the basic properties of the normal distribution and allows maintaining the elegance and optimality of parameter estimation via the maximum likelihood (ML) method under normality (Lange and Sinsheimer 1993; Jamshidian 1999). In this work we apply a penalized EM algorithm (Green 1990) to estimation in P-splines. An interesting characteristic of the proposed procedure is that the estimator of the coefficients adopts the form of a weighted smoother. The estimation procedure proposed in this work has been implemented in the heavy package (Osorio 2014), developed as an extension of the R statistical software (R Core Team 2014). The package is available from CRAN and the website http://heavy.mat.utfsm.cl. We studied influence diagnostics to determine the robustness of the proposed procedure against outlying observations and some common perturbation schemes, considering the approaches of case deletion and local influence for models with incomplete data as described in Zhu and Lee (2001) and Zhu et al. (2001).

This article is organized as follows: Section 2 introduces P-splines considering heavy-tailed distributions, develops a variant of the EM algorithm to obtain the penalized maximum likelihood (PML) estimators, and presents the selection of the smoothing parameter using a weighted version of the generalized cross validation (GCV) criterion. Section 3 describes the main results associated with influence diagnostics by case deletion and local influence for models with incomplete data, and presents the generalized Cook distance and the normal curvature under several perturbation schemes for the proposed model. In Section 4 the methodology is applied to the dataset of life expectancy in 101 countries (Leinhardt and Wasserman 1979) assuming distributions with heavier tails than the normal, and some simulation results are also discussed. The numerical experiments show the utility of the proposed methodology. In Section 5 we present some final considerations.

2 P-splines under heavy-tailed distributions

In this section we propose an alternative for accommodating extreme and outlying observations in penalized splines based on distributions with heavier tails than the normal. We also describe the PML estimation for P-splines using a penalized EM algorithm and present the selection of the smoothing parameter through a weighted version of the GCV criterion. The estimation of the shape parameters of the mixture variable is also described.

2.1 Estimation in P-splines using the penalized EM algorithm

Consider the model

$$Y_i = g(x_i) + \epsilon_i, \qquad i = 1, \dots, n, \qquad (1)$$


where the responses Y_i are observed at design points x_i and g is a smooth function defined on [a, b]. It is assumed that the design points satisfy a ≤ x_1 < · · · < x_n ≤ b and that the {ε_i} are random variables with zero position and scale φ > 0. For simplicity, we consider that $g(x) = \sum_{j=1}^{p} a_j B_j(x)$, where p is the number of known basis functions B_1(x), ..., B_p(x). B-splines are a common choice for the basis functions.

The class of SMN distributions (Andrews and Mallows 1974) represents an interesting alternative to the normal distribution in the presence of extreme observations and has been applied very successfully for statistical modeling by a number of authors, among them Butler et al. (1990), Lange and Sinsheimer (1993) and Jamshidian (1999). A random variable Y is said to follow an SMN distribution (Andrews and Mallows 1974) with position parameter µ ∈ R and scale φ > 0 if it can be written as $Y \overset{d}{=} \mu + \tau^{-1/2} Z$, where Z ∼ N(0, φ) and τ is a positive random variable with distribution function H(τ; ν), where ν represents a scalar or vector valued parameter that controls the shape of the distribution. The density function of Y is given by

$$f(y) = (2\pi\phi)^{-1/2} \int_0^\infty \tau^{1/2} \exp\big(-\tfrac{1}{2}\tau D^2\big)\, dH(\tau), \qquad (2)$$

where D² = (y − µ)²/φ represents the distance between y and the center µ, scaled by φ. When Y has the density given in (2), we write Y ∼ SMN(µ, φ; H). It is convenient to express the distribution of the random variable Y alternatively through the following hierarchical representation:

$$Y \mid \tau \sim N(\mu, \phi/\tau), \qquad \tau \sim H(\nu). \qquad (3)$$
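As a hedged illustration of one immediate use of (3), the following R sketch draws SMN variates via the hierarchy (the function name rsmn is illustrative, not from any package; the Gamma(ν/2, ν/2) and Beta(ν, 1) mixing laws anticipate the Student-t and slash cases formalized in the next paragraph):

```r
## Minimal sketch: simulate SMN variates via the hierarchical form (3).
rsmn <- function(n, mu = 0, phi = 1, family = c("t", "slash"), nu = 4) {
  family <- match.arg(family)
  tau <- switch(family,
    t     = rgamma(n, shape = nu / 2, rate = nu / 2),  # Student-t mixing law
    slash = rbeta(n, shape1 = nu, shape2 = 1))         # slash mixing law
  mu + sqrt(phi / tau) * rnorm(n)                      # Y = mu + tau^{-1/2} Z
}
set.seed(1)
y <- rsmn(1000, family = "t", nu = 4)  # a heavy-tailed sample
```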

The formulation given in (3) is useful, for example, for random number generation, as just illustrated, and for parameter estimation via a missing data formulation through the EM algorithm (Dempster et al. 1977). In this work the Student-t and slash distributions are considered to illustrate the proposed methodology. In fact, the Student-t distribution can be written using the representation in (3) by considering τ ∼ Gamma(ν/2, ν/2), and we write Y ∼ t(µ, φ; ν), ν > 0, while for the slash distribution, denoted by Y ∼ Slash(µ, φ; ν), ν > 0, we have τ ∼ Beta(ν, 1). For both the Student-t and slash distributions, ν represents the degrees of freedom, and this parameter controls the kurtosis of the distribution. It is interesting to note that when ν → ∞ the normal distribution is recovered. It should be emphasized that other distributions can also be considered, such as the contaminated normal (Little 1988) and the Laplace or double exponential (Phillips 2002).

We introduce scale mixtures of normal distributions for the model given in (1) by considering the following distributional assumption:

$$Y_i \overset{ind}{\sim} SMN(b_i^\top a, \phi; H), \qquad i = 1, \dots, n, \qquad (4)$$

where B = (b_1, ..., b_n)^⊤ = (B_j(x_i)) is an n × p matrix and a = (a_1, ..., a_p)^⊤. Thus, P-splines considering heavy-tailed distributions can be introduced by


obtaining the PML estimates in the following penalized problem:

$$\ell_\lambda(\theta; Y_{obs}) = \ell_o(\theta; Y_{obs}) - \frac{\lambda}{2\phi} \int_a^b \{g''(x)\}^2\, dx = \ell_o(\theta; Y_{obs}) - \frac{\lambda}{2\phi}\, a^\top K^\top K a, \qquad (5)$$

where θ = (a^⊤, φ)^⊤, λ > 0 represents a smoothing parameter, K^⊤K is the matrix representation of the penalty described in Eilers and Marx (1996), and K is the k × p matrix (k ≤ p) of kth order differences. Details about the construction of the difference matrix K are discussed in Eilers and Marx (1996, 2010). The log-likelihood function for the class of SMN distributions given in (5) assumes the form

$$\ell_o(\theta; Y_{obs}) = -\frac{n}{2}\log 2\pi\phi + \sum_{i=1}^n \log \int_0^\infty \tau_i^{1/2} \exp\big(-\tfrac{1}{2}\tau_i D_i^2(\theta)\big)\, dH(\tau_i),$$

where D_i²(θ) = (Y_i − b_i^⊤a)²/φ, for i = 1, ..., n.

The estimation problem given in (5) can be significantly simplified by considering an incomplete data formulation. Using the hierarchical formulation of a random variable with an SMN distribution, it is possible to re-write the model proposed in (4) as

$$Y_i \mid \tau_i \overset{ind}{\sim} N(b_i^\top a, \phi/\tau_i), \qquad \tau_i \overset{ind}{\sim} H(\nu), \qquad i = 1, \dots, n.$$

Thus, it is possible to apply the penalized EM algorithm (Green 1990) to estimate the parameters in (5) by assuming that τ = (τ_1, ..., τ_n)^⊤ are missing variables. The penalized log-likelihood function for the model based on the complete data Y_com = (Y^⊤, τ^⊤)^⊤ is defined through

$$\ell_\lambda(\theta; Y_{com}) = \ell_c(\theta; Y_{com}) - \frac{\lambda}{2\phi}\, a^\top K^\top K a,$$

with

$$\ell_c(\theta; Y_{com}) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}\sum_{i=1}^n \tau_i (Y_i - b_i^\top a)^2 + \log h^{(n)}(\tau; \nu) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}(Y - Ba)^\top W (Y - Ba) + \log h^{(n)}(\tau; \nu),$$

where h^{(n)}(τ; ν) is the joint density function of the mixture variables τ = (τ_1, ..., τ_n)^⊤ and W = diag(τ_1, ..., τ_n). Assuming that ν is known, it is possible to show that the conditional expectation of the complete data penalized log-likelihood function, given a current estimate θ^{(k)},

$$Q_\lambda(\theta \mid \theta^{(k)}) = E[\ell_c(\theta; Y_{com}) \mid Y, \theta^{(k)}] - \frac{\lambda}{2\phi}\, a^\top K^\top K a,$$


can be expressed as

$$Q_\lambda(\theta \mid \theta^{(k)}) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}\big[(Y - Ba)^\top W^{(k)} (Y - Ba) + \lambda a^\top K^\top K a\big], \qquad (6)$$

where W^{(k)} = diag(τ_1^{(k)}, ..., τ_n^{(k)}) with τ_i^{(k)} = E(τ_i | Y_i, θ^{(k)}). In general, it is possible to show that the weight function, defined by the expectation τ_i^{(k)}, is given by

$$E(\tau_i \mid Y_i, \theta^{(k)}) = \left.\frac{\int_0^\infty \tau_i^{3/2} \exp\big(-\tfrac{1}{2}\tau_i D_i^2(\theta)\big)\, dH(\tau_i)}{\int_0^\infty \tau_i^{1/2} \exp\big(-\tfrac{1}{2}\tau_i D_i^2(\theta)\big)\, dH(\tau_i)}\right|_{\theta = \theta^{(k)}}.$$

Note that for most of the distributions in the SMN class the weight function τ_i^{(k)} can be easily computed (see Lange and Sinsheimer 1993). For the Student-t distribution it is well known that

$$\tau_i^{(k)} = (\nu + 1)/(\nu + D_i^2(\theta^{(k)})), \qquad (7)$$

while for the slash distribution, τ_i^{(k)} assumes the form (Jamshidian 1999)

$$\tau_i^{(k)} = \Big(\frac{2\nu + 1}{D_i^2(\theta^{(k)})}\Big) \frac{P(\nu + \tfrac{3}{2},\, D_i^2(\theta^{(k)})/2)}{P(\nu + \tfrac{1}{2},\, D_i^2(\theta^{(k)})/2)}, \qquad (8)$$

where P(α, z) is the incomplete gamma function of parameter α at z (Abramowitz and Stegun 1970, p. 260), defined as

$$P(\alpha, z) = \frac{1}{\Gamma(\alpha)} \int_0^z e^{-t} t^{\alpha - 1}\, dt.$$

To maximize Q_λ(θ | θ^{(k)}) given in (6) with respect to θ = (a^⊤, φ)^⊤, we solve the first order condition and update θ^{(k+1)} as

$$a_\lambda^{(k+1)} = (B^\top W^{(k)} B + \lambda K^\top K)^{-1} B^\top W^{(k)} Y, \qquad (9)$$

$$\phi_\lambda^{(k+1)} = \frac{1}{n}\big\{\mathrm{RSS}_{W^{(k)}}(a_\lambda^{(k+1)}) + \lambda \|K a_\lambda^{(k+1)}\|^2\big\}, \qquad (10)$$

where RSS_W(a) = (Y − Ba)^⊤ W (Y − Ba). The PML estimates for the problem in (5) are obtained by iterating the E and M steps of the algorithm, described in equations (6) and (9)-(10), until convergence is reached.
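To make the iteration concrete, here is a hedged R sketch of the updates (7), (9) and (10) for the Student-t case with λ and ν held fixed. The function name pspline_em_t and its defaults are illustrative assumptions; the heavy package implementation differs in details such as basis construction and the smoothing parameter search.

```r
## Minimal sketch of the penalized EM iteration (6), (9)-(10) under
## Student-t errors with fixed nu and lambda; illustrative, not heavy's code.
library(splines)

pspline_em_t <- function(y, x, lambda, nu = 4, df = 23, maxit = 200, tol = 1e-8) {
  B <- bs(x, df = df, intercept = TRUE)          # B-spline basis (n x p)
  p <- ncol(B)
  K <- diff(diag(p), differences = 2)            # second-order difference matrix
  KtK <- crossprod(K)
  a <- qr.solve(crossprod(B) + lambda * KtK, crossprod(B, y))  # LS start
  phi <- mean((y - B %*% a)^2)
  for (it in seq_len(maxit)) {
    D2  <- drop(y - B %*% a)^2 / phi             # squared distances D_i^2
    tau <- (nu + 1) / (nu + D2)                  # E-step: Student-t weights (7)
    a_new <- qr.solve(crossprod(B, tau * B) + lambda * KtK,
                      crossprod(B, tau * y))     # M-step (9): weighted smoother
    r <- drop(y - B %*% a_new)
    phi <- (sum(tau * r^2) + lambda * sum((K %*% a_new)^2)) / length(y)  # (10)
    done <- max(abs(a_new - a)) < tol
    a <- a_new
    if (done) break
  }
  list(coef = drop(a), phi = phi, fitted = drop(B %*% a), weights = tau)
}
```

Note how the coefficient update (9) is exactly a weighted smoother, the characteristic of the procedure highlighted in the Introduction.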


It is possible to modify the estimation procedure delineated above to simultaneously estimate a, φ and the tuning parameter ν. In this case, the expectation of the complete data penalized log-likelihood function assumes the form

$$Q_\lambda(\theta \mid \theta^{(k)}) = Q_\lambda(a, \phi \mid \theta^{(k)}) + Q_\lambda(\nu \mid \theta^{(k)}), \qquad (11)$$

with θ = (a^⊤, φ, ν^⊤)^⊤, where Q_λ(a, φ | θ^{(k)}) is given in equation (6), while

$$Q_\lambda(\nu \mid \theta^{(k)}) = E[\log h^{(n)}(\tau; \nu) \mid Y, \theta^{(k)}]. \qquad (12)$$

For the M step we update a^{(k+1)} and φ^{(k+1)} according to equations (9) and (10), respectively, and we obtain ν^{(k+1)} through

$$\nu^{(k+1)} = \arg\max_{\nu}\, Q_\lambda(\nu \mid \theta^{(k)}). \qquad (13)$$

Indeed, ν^{(k+1)} can be obtained as the solution of the system of equations ∂Q_λ(ν | θ^{(k)})/∂ν = 0. Hence, the form that the conditional expectation (12) and the M step (13) adopt depends on the specific choice of the distribution of the random variable τ_i. In order to illustrate the estimation of the tuning parameter ν, below we present the estimation of the degrees of freedom for the slash and Student-t distributions. Additional details can be found in Lange and Sinsheimer (1993), McLachlan and Krishnan (1997) and Jamshidian (1999).

For the particular case of the Student-t distribution, it follows that τ_i | Y_i ∼ Gamma((ν + 1)/2, (ν + D_i²(θ))/2), independently for i = 1, ..., n. Using results from McLachlan and Krishnan (1997) we have

$$E(\log \tau_i \mid Y_i, \theta^{(k)}) = \log \tau_i^{(k)} + \Big\{\psi\Big(\frac{\nu^{(k)} + 1}{2}\Big) - \log\Big(\frac{\nu^{(k)} + 1}{2}\Big)\Big\},$$

where τ_i^{(k)} is defined in (7) and ψ(z) = d log Γ(z)/dz is the digamma function (Abramowitz and Stegun 1970, p. 268). Thus, the conditional expectation of the complete data penalized log-likelihood associated with ν assumes the form

$$Q_\lambda(\nu \mid \theta^{(k)}) = \frac{n\nu}{2}\log\Big(\frac{\nu}{2}\Big) - n\log\Gamma\Big(\frac{\nu}{2}\Big) + \frac{n\nu}{2}\Big\{\frac{1}{n}\sum_{i=1}^n \big(\log \tau_i^{(k)} - \tau_i^{(k)}\big) + \psi\Big(\frac{\nu^{(k)} + 1}{2}\Big) - \log\Big(\frac{\nu^{(k)} + 1}{2}\Big)\Big\}.$$

It is possible to update ν^{(k+1)} as the solution to the equation ∂Q_λ(ν | θ^{(k)})/∂ν = 0 using a one-dimensional Newton-Raphson method.

Using errors following a slash distribution, the calculation of the conditional expectation in (12) requires evaluating (see Lange and Sinsheimer 1993)

$$E(\log \tau_i \mid Y_i, \theta^{(k)}) = \frac{\int_0^1 \log(\tau_i)\, \tau_i^{\nu - 1/2} \exp\big(-\tfrac{1}{2}\tau_i D_i^2(\theta^{(k)})\big)\, d\tau_i}{\int_0^1 \tau_i^{\nu - 1/2} \exp\big(-\tfrac{1}{2}\tau_i D_i^2(\theta^{(k)})\big)\, d\tau_i} = \psi(\nu + \tfrac{1}{2}) - \log\big(D_i^2(\theta^{(k)})/2\big) + \frac{\partial P(\nu + \tfrac{1}{2},\, D_i^2(\theta^{(k)})/2)/\partial\nu}{P(\nu + \tfrac{1}{2},\, D_i^2(\theta^{(k)})/2)}.$$

The derivative of the incomplete gamma function ∂P(a, x)/∂a can be evaluated using the algorithm described in Moore (1982). In this case, the conditional expectation in (12) is given by

$$Q_\lambda(\nu \mid \theta^{(k)}) = n\log\nu + \nu \sum_{i=1}^n E(\log \tau_i \mid Y_i, \theta^{(k)}).$$


Maximizing Q_λ(ν | θ^{(k)}) with respect to ν, we obtain

$$\nu^{(k+1)} = -\frac{n}{\sum_{i=1}^n E(\log \tau_i \mid Y_i, \theta^{(k)})}.$$

Remark 1 In this work, we address the estimation of the shape parameters for the mixture variables using the EM algorithm, following the approach proposed by a number of authors in settings such as nonlinear regression models (Lange and Sinsheimer 1993; Jamshidian 1999) and linear mixed-effects models under Student-t errors (Pinheiro et al. 2001; Lin and Lee 2006). Although the approach of these works has been quite successful in practice, some authors (see, for instance, Lucas 1997; Fernández and Steel 1999) have warned about potential problems that may arise in the estimation of the degrees of freedom for the Student-t distribution. In particular, Lucas (1997) has pointed out, by using influence functions for the univariate case, that the protection against outliers is only attained when this parameter is kept fixed. Moreover, when the degrees of freedom are estimated by maximum likelihood, the influence functions for the scale and the degrees of freedom, as well as the change-of-variance function of the position parameter, are unbounded. Thus, one alternative is to assume that the parameters associated with the mixture variables τ_i are known. In order to achieve protection against outliers, Lange et al. (1989) suggest that the degrees of freedom of the Student-t distribution must be kept fixed at a small, reasonable value such as ν = 4. There is an option in the heavy package that allows one to keep the shape parameter ν fixed.

2.2 Smoothing parameter selection

Several authors have suggested modifications to the GCV criterion (Craven and Wahba 1979) for the appropriate selection of the smoothing parameter λ. For example, O'Sullivan et al. (1986) and Gu (1992) proposed versions of the GCV criterion for non-Gaussian data, focused mainly on penalized maximum likelihood for distributions in the exponential family, while Wei (2005) examined the asymptotic properties of the criterion of robust cross validation based on M-estimation procedures. In this work we choose the smoothing parameter by minimizing the weighted cross validation criterion, defined following O'Sullivan et al. (1986) as

$$V(\lambda) = \frac{\frac{1}{n}\sum_{i=1}^n \hat\tau_i (Y_i - \hat g_\lambda(x_i))^2}{\{\operatorname{tr}(I - H_{\widehat W}(\lambda))/n\}^2} = \frac{\|\widehat W^{1/2}(I - H_{\widehat W}(\lambda))Y\|^2/n}{\{\operatorname{tr}(I - H_{\widehat W}(\lambda))/n\}^2}, \qquad (14)$$

with τ̂_i = E(τ_i | Y_i, θ̂) and ĝ_λ = (ĝ_λ(x_1), ..., ĝ_λ(x_n))^⊤ = H_Ŵ(λ)Y, where the prediction matrix assumes the form

$$H_W(\lambda) = B(B^\top W B + \lambda K^\top K)^{-1} B^\top W. \qquad (15)$$

As Gu (1992) suggested, it is possible to alternate the minimization of (14) with the steps of the penalized EM algorithm described in equations (6) and (9)-(10) or (9)-(13), whichever applies.
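As a hedged illustration of how (14)-(15) can be evaluated directly, the following sketch scans a grid of λ values; wgcv and the grid are illustrative assumptions (the heavy package instead uses the nested SVD scheme mentioned in Remark 2 below).

```r
## Illustrative sketch: evaluate the WGCV criterion (14)-(15) on a grid and
## pick the minimizer; B, K, y and the weights tau come from a previous E-step.
wgcv <- function(lambda, B, K, y, tau) {
  n <- length(y)
  W <- diag(tau)
  H <- B %*% solve(crossprod(B, W %*% B) + lambda * crossprod(K), t(B) %*% W)
  r <- drop((diag(n) - H) %*% y)               # residuals (I - H_W(lambda)) y
  (sum(tau * r^2) / n) / (sum(diag(diag(n) - H)) / n)^2
}
grid <- exp(seq(log(1e-3), log(1e3), length.out = 60))
# lambda_hat <- grid[which.min(sapply(grid, wgcv, B = B, K = K, y = y, tau = tau))]
```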


Remark 2 The convergence properties of the penalized EM algorithm proposed in Section 2.1 have been studied, in the general context of penalized log-likelihood estimation with λ fixed, by Green (1990). However, for varying λ, Gu (1992), Xiang and Wahba (1996) and Gu and Xiang (2001), among others, have pointed out that choosing the smoothing parameter via the indirect method described above may not converge. In our implementation we follow the suggestions given by Gu (1992), and we did not encounter such problems in our numerical experiments. In addition, it can be stressed that the WGCV criterion defined in (14) is similar to the robust GCV used by Tharmaratnam et al. (2010, Eq. 2.18). In fact, following Lucas (1997), the relationship between the WGCV and robust GCV criteria can be highlighted by defining â_λ as the solution to

$$\min_a\, \sum_{i=1}^n \rho(D_i^2) + \frac{\lambda}{2\phi}\, a^\top K^\top K a,$$

where ρ(D²) = −log f(y; a, φ), with f(y; a, φ) the density function obtained from Equation (4). Furthermore, we can also note that the matrix H_Ŵ(λ) has the same diagonal elements as the matrix

$$\widehat W^{1/2} B(B^\top \widehat W B + \lambda K^\top K)^{-1} B^\top \widehat W^{1/2},$$

which is analogous to the prediction matrix defined by Tharmaratnam et al. (2010). The implementation available in the heavy package uses two nested singular value decompositions to efficiently evaluate the weighted GCV criterion; details are presented in Appendix A of the supplementary material.

3 Influence Diagnostics

Below we describe two of the main procedures to determine the influence of outlying observations. We consider diagnostic measures suitable for models with incomplete data, based on the PML estimation using the penalized EM algorithm. Firstly, we present the approach of case deletion using the generalized Cook distance (Zhu et al. 2001). Subsequently, we develop the diagnostics using the local influence method proposed by Zhu and Lee (2001). The proofs of Propositions 1 to 5 are deferred to Appendix B of the supplementary material.

3.1 Case deletion measures

To evaluate the effect of dropping the ith observation on the PML estimation of the p*-dimensional parameter vector θ, it is possible to use the Cook distance, defined as

$$C_i = (\hat\theta - \hat\theta_{(i)})^\top M (\hat\theta - \hat\theta_{(i)}), \qquad i = 1, \dots, n,$$


where θ̂_(i) represents the PML estimate of θ once the ith observation has been dropped from the dataset, and M is a positive definite matrix of order p* × p* (see Cook and Weisberg 1982). Zhu et al. (2001) proposed the generalized Cook distance for models with incomplete data, defined by

$$GC_i = (\hat\theta - \hat\theta_{(i)})^\top \{-\ddot Q_\lambda(\hat\theta \mid \hat\theta)\}\, (\hat\theta - \hat\theta_{(i)}), \qquad i = 1, \dots, n,$$

where Q̈_λ(θ̂ | θ̂) = ∂²Q_λ(θ | θ̂)/∂θ∂θ^⊤ |_{θ=θ̂}. To reduce the computational burden involved in calculating θ̂_(i), i = 1, ..., n, the following one-step approximation has been proposed (Cook and Weisberg 1982; Zhu et al. 2001):

$$\hat\theta_{(i)}^1 = \hat\theta + \{-\ddot Q_\lambda(\hat\theta \mid \hat\theta)\}^{-1} \dot Q_{\lambda(i)}(\hat\theta \mid \hat\theta), \qquad i = 1, \dots, n,$$

where

$$Q_{\lambda(i)}(\theta \mid \hat\theta) = E[\ell_c(\theta; Y_{com(i)}) \mid Y_{(i)}, \hat\theta] - \lambda J(\theta),$$

with Y_com(i) = (Y_(i)^⊤, τ_(i)^⊤)^⊤ being the complete data vector when the ith observation has been deleted, and Q̇_{λ(i)}(θ̂ | θ̂) = ∂Q_{λ(i)}(θ | θ̂)/∂θ |_{θ=θ̂}. For a wide variety of statistical models it is possible to write $Q_\lambda(\theta \mid \hat\theta) = \sum_{i=1}^n Q_{\lambda,i}(\theta \mid \hat\theta)$. Thus, $Q_{\lambda(i)}(\theta \mid \hat\theta) = \sum_{j \neq i} Q_{\lambda,j}(\theta \mid \hat\theta)$, in which case we have

$$\dot Q_{\lambda(i)}(\hat\theta \mid \hat\theta) + \dot Q_{\lambda,i}(\hat\theta \mid \hat\theta) = \dot Q_\lambda(\hat\theta \mid \hat\theta) = 0,$$

and consequently we consider the following one-step approximation for the generalized Cook distance:

$$GC_i^1 = \dot Q_{\lambda,i}^\top(\hat\theta \mid \hat\theta)\{-\ddot Q_\lambda(\hat\theta \mid \hat\theta)\}^{-1} \dot Q_{\lambda,i}(\hat\theta \mid \hat\theta), \qquad i = 1, \dots, n.$$

The following proposition gives the analytical form of the Hessian matrix for the penalized splines under heavy-tailed distributions discussed in Section 2.

Proposition 1 For the model given in Equation (4) from Section 2.1, the (p + 1) × (p + 1) Hessian matrix associated with the penalized function Q_λ(θ | θ̂), evaluated at θ = θ̂, assumes the form

$$\ddot Q_\lambda(\hat\theta \mid \hat\theta) = \begin{pmatrix} \ddot Q_\lambda(\hat a \mid \hat\theta) & 0 \\ 0 & \ddot Q_\lambda(\hat\phi \mid \hat\theta) \end{pmatrix} = -\begin{pmatrix} \frac{1}{\hat\phi}(B^\top \widehat W B + \lambda K^\top K) & 0 \\ 0 & n/(2\hat\phi^2) \end{pmatrix}.$$

To obtain the generalized Cook distance in the model described in Section 2.1, we consider $Q_\lambda(\theta \mid \hat\theta) = \sum_{i=1}^n Q_{\lambda,i}(\theta \mid \hat\theta)$, with

$$Q_{\lambda,i}(\theta \mid \hat\theta) = -\frac{1}{2}\Big\{\log\phi + \frac{1}{\phi}\,\hat\tau_i (Y_i - b_i^\top a)^2 + \frac{\lambda}{n\phi}\|Ka\|^2\Big\}.$$

Using Proposition 1, it follows immediately that the one-step approximation for the generalized Cook distance is given as GC_i^1 = GC_i^1(â) + GC_i^1(φ̂), i = 1, ..., n, where

$$GC_i^1(\hat a) = \dot Q_{\lambda,i}^\top(\hat a \mid \hat\theta)\{-\ddot Q_\lambda(\hat a \mid \hat\theta)\}^{-1} \dot Q_{\lambda,i}(\hat a \mid \hat\theta),$$

$$GC_i^1(\hat\phi) = \dot Q_{\lambda,i}^\top(\hat\phi \mid \hat\theta)\{-\ddot Q_\lambda(\hat\phi \mid \hat\theta)\}^{-1} \dot Q_{\lambda,i}(\hat\phi \mid \hat\theta), \qquad (16)$$


with

$$\dot Q_{\lambda,i}(\hat a \mid \hat\theta) = \frac{1}{\hat\phi}\Big\{\hat\tau_i (Y_i - b_i^\top \hat a)\, b_i - \frac{\lambda}{n} K^\top K \hat a\Big\},$$

$$\dot Q_{\lambda,i}(\hat\phi \mid \hat\theta) = -\frac{1}{2\hat\phi} + \frac{1}{2\hat\phi^2}\Big\{\hat\tau_i (Y_i - b_i^\top \hat a)^2 + \frac{\lambda}{n}\|K\hat a\|^2\Big\}.$$

The distances GC_i^1(â) and GC_i^1(φ̂) given in (16) offer an interesting interpretation. In fact, GC_i^1(â) allows us to assess the influence of the ith observation on the PML estimate of a, and analogously for GC_i^1(φ̂). In addition, these measures complement and extend the results developed by Eubank and Gunst (1986), Kim (1996), Wei (2004) and Ibacache-Pulgar and Paula (2011).
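A hedged computational sketch of GC_i^1(â) follows, reusing the illustrative quantities from the EM sketch in Section 2.1 (gcook_a is an assumed helper name; the score rows follow the reconstructed expression above, and the quadratic form is insensitive to an overall sign).

```r
## Illustrative sketch: one-step generalized Cook distances GC_i^1(a) from
## (16), using the Hessian block of Proposition 1.
gcook_a <- function(fit, B, K, y, lambda) {
  n <- length(y); e <- y - fit$fitted; tau <- fit$weights; phi <- fit$phi
  Hess <- (crossprod(B, tau * B) + lambda * crossprod(K)) / phi  # -Qddot(a-hat)
  Ka <- drop(crossprod(K) %*% fit$coef)
  ## row i holds the score contribution (1/phi){tau_i e_i b_i - (lambda/n) K'K a}
  S <- (tau * e * B - matrix(lambda * Ka / n, n, ncol(B), byrow = TRUE)) / phi
  rowSums((S %*% solve(Hess)) * S)               # quadratic forms GC_i^1(a)
}
```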

3.2 Local influence

The local influence method proposed by Cook (1986) allows studying the effect produced by introducing small perturbations into the model and/or the data. The procedure was extended by Zhu and Lee (2001) to handle situations with incomplete data. They focused on assessing the local behavior of the Q-displacement function, given by

$$f_Q(\omega) = 2\{Q_\lambda(\hat\theta \mid \hat\theta) - Q_\lambda(\hat\theta(\omega) \mid \hat\theta)\},$$

where ω = (ω_1, ..., ω_q)^⊤ is a vector of perturbations restricted to some open subset Ω ⊂ R^q and, in the context of this work, θ̂(ω) denotes the PML estimate of θ based on

$$Q_\lambda(\theta, \omega \mid \hat\theta) = E[\ell_c(\theta, \omega; Y_{com}) \mid Y_{obs}, \hat\theta] - \lambda J(\theta).$$

It is assumed that there is a vector of null perturbation ω_0 ∈ Ω which satisfies ℓ_o(θ, ω_0; Y_obs) = ℓ_o(θ; Y_obs) and ℓ_c(θ, ω_0; Y_com) = ℓ_c(θ; Y_com). The objective of the local influence technique is to compare θ̂ and θ̂(ω) by studying the local behavior of γ(ω) = (ω^⊤, f_Q(ω))^⊤ around ω_0 (Cook 1986; Zhu and Lee 2001). Consider ω = ω_0 + εh, where h, with ‖h‖ = 1, is a unit direction and ε ∈ R. Zhu and Lee (2001) used the same reasoning developed by Cook (1986) and showed that the normal curvature C_h(θ) can be employed to characterize the local behavior of f_Q(ω_0 + εh) around the value ε = 0 for a direction h; it is given by

$$C_h(\theta) = 2\, h^\top \Delta^\top(\omega_0)\{-\ddot Q_\lambda(\hat\theta \mid \hat\theta)\}^{-1} \Delta(\omega_0)\, h,$$

where ∆(ω) = ∂²Q_λ(θ, ω | θ̂)/∂θ∂ω^⊤ |_{θ=θ̂(ω)}. The direction of maximum curvature h_max, determined by the eigenvector associated with the largest eigenvalue of the matrix F = ∆^⊤(ω_0){−Q̈_λ(θ̂ | θ̂)}^{-1}∆(ω_0), is used to identify how to perturb the postulated model to obtain the greatest local change in the Q-displacement function.


Several authors have proposed examining other relevant directions to investigate local influence. For example, Escobar and Meeker (1992) suggested considering the index plot of C_i(θ) = C_{h_i}(θ), i = 1, ..., n, where h_i is a q × 1 vector with one in the ith position and zeros elsewhere. In fact, C_i(θ) allows the evaluation of the influence of the ith observation due to the aggregated contribution of all the basic perturbation vectors (Poon and Poon 1999). Poon and Poon (1999) proposed the conformal normal curvature B_h(θ) = C_h(θ)/{tr(T²)}^{1/2} to avoid invariance problems under uniform changes of scale. The conformal normal curvature satisfies 0 ≤ B_h(θ) ≤ 1, a property that allows comparison of curvatures obtained by considering different SMN models.

Following Thomas (1991) and Shi and Wang (1999), it is possible to determine the observations that have a strong impact on the smoothing parameter selection in penalized splines, that is, on λ̂(ω), obtained by introducing a small perturbation ω ∈ Ω into the WGCV criterion given in (14). It is assumed that there is a vector of null perturbation ω_0 ∈ Ω such that λ̂(ω_0) = λ̂. The direction of greatest local change is h_max(V) ∝ ∂λ̂(ω)/∂ω, which should be evaluated at ω_0. Since λ̂(ω) is chosen by minimizing a perturbed version of the WGCV criterion, we have ∂V(λ, ω)/∂λ |_{λ=λ̂(ω)} = 0. Differentiating both sides of this equation with respect to ω and evaluating at ω_0 gives

$$\Big\{\frac{\partial^2 V(\lambda, \omega)}{\partial\omega\,\partial\lambda} + \frac{\partial^2 V(\lambda, \omega)}{\partial\lambda^2}\, \frac{\partial\hat\lambda(\omega)}{\partial\omega}\Big\}\Big|_{\omega=\omega_0,\, \lambda=\hat\lambda} = 0.$$

Thus,

$$\frac{\partial\hat\lambda(\omega)}{\partial\omega}\Big|_{\omega=\omega_0} = -\Big\{\Big(\frac{\partial^2 V(\lambda, \omega)}{\partial\lambda^2}\Big)^{-1} \frac{\partial^2 V(\lambda, \omega)}{\partial\omega\,\partial\lambda}\Big\}\Big|_{\omega=\omega_0,\, \lambda=\hat\lambda}. \qquad (17)$$

Below we consider the scale and response perturbation schemes. Each scheme is applied to the complete data penalized log-likelihood function and to the WGCV criterion. In order to find h_max we must compute the curvature matrix F = ∆^⊤(ω_0){−Q̈_λ(θ̂ | θ̂)}^{-1}∆(ω_0), where Q̈_λ(θ̂ | θ̂) was given in Proposition 1 and, for each perturbation scheme, ∆(ω) assumes the partitioned form ∆(ω) = (∆_a^⊤(ω), ∆_φ^⊤(ω))^⊤, with

$$\Delta_a(\omega) = \frac{\partial^2 Q_\lambda(\theta, \omega \mid \hat\theta)}{\partial a\, \partial\omega^\top}, \qquad \Delta_\phi(\omega) = \frac{\partial^2 Q_\lambda(\theta, \omega \mid \hat\theta)}{\partial\phi\, \partial\omega^\top}.$$

We should emphasize that the ∆(ω) matrix is specific to the perturbation scheme under consideration. Propositions 2 and 3 present analytic expressions for the matrix ∆(ω_0), while Propositions 4 and 5 give explicit formulas for ∂λ̂(ω)/∂ω when the WGCV criterion is perturbed.

3.2.1 Scale perturbation on the complete-data penalized log-likelihood

This perturbation scheme is defined by introducing weights in the scale; that is, the following distributional assumption is considered:

$$Y_i(\omega) \overset{ind}{\sim} SMN(b_i^\top a, \phi/\omega_i; H), \qquad i = 1, \dots, n, \qquad (18)$$


where ω = (ω_1, ..., ω_n)^⊤, with ω_i > 0 for i = 1, ..., n. In this case, the vector of null perturbation is ω_0 = 1_n, with 1_n = (1, ..., 1)^⊤. The perturbed log-likelihood function for the complete data model assumes the form

$$\ell_c(\theta, \omega; Y_{com}) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}(Y - Ba)^\top W^{1/2} \operatorname{diag}(\omega)\, W^{1/2} (Y - Ba),$$

where diag(ω) = diag(ω_1, ..., ω_n) is a diagonal matrix whose diagonal elements are given by the vector ω. Then the following proposition holds.

Proposition 2 For the penalized splines considering heavy-tailed distributions, under the scale perturbation scheme defined in Equation (18) and when θ = (a^⊤, φ)^⊤ are the parameters of interest, the ∆(ω_0) matrix can be written as ∆(ω_0) = (∆_a^⊤(ω_0), ∆_φ^⊤(ω_0))^⊤, where

$$\Delta_a(\omega_0) = \frac{1}{\hat\phi} B^\top \widehat W \operatorname{diag}(e), \qquad \Delta_\phi(\omega_0) = \frac{1}{2\hat\phi}\, e^\top \widehat W \operatorname{diag}(e),$$

with e = Y − Bâ_λ being the residual vector.

A direct consequence of Proposition 2 is that the curvature matrix F can be written as F = F_a + F_φ, with

$$F_a = \Delta_a^\top(\omega_0)\{-\ddot Q_\lambda(\hat a \mid \hat\theta)\}^{-1}\Delta_a(\omega_0) = \frac{1}{\hat\phi} \operatorname{diag}(e)\, \widehat W B (B^\top \widehat W B + \lambda K^\top K)^{-1} B^\top \widehat W \operatorname{diag}(e),$$

and

$$F_\phi = \Delta_\phi^\top(\omega_0)\{-\ddot Q_\lambda(\hat\phi \mid \hat\theta)\}^{-1}\Delta_\phi(\omega_0) = \frac{1}{2n} \operatorname{diag}(e)\, \widehat W e\, e^\top \widehat W \operatorname{diag}(e).$$

This perturbation scheme is equivalent to the case-weights perturbation, where weights are introduced with the aim of detecting which observations have a prominent contribution to the residual sum of squares, RSS_W(a). Indeed, the case-weights (or scale) perturbation generalizes the concept of influence by means of case deletion (see, for instance, Thomas 1991).


3.2.2 Response perturbation on the complete-data penalized log-likelihood

This scheme is defined by introducing additive perturbations in the observed responses as Y(ω) = Y + ω, where ω = (ω_1, ..., ω_n)^⊤ and ω_0 = 0 denotes the vector of null perturbation. Under this perturbation scheme, the conditional expectation of the complete data penalized log-likelihood function is given by

$$Q_\lambda(\theta, \omega \mid \hat\theta) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}\big[(Y(\omega) - Ba)^\top \widehat W (Y(\omega) - Ba) + \lambda a^\top K^\top K a\big].$$

To obtain an explicit formula for the ∆(ω_0) matrix under response variable perturbation, consider the following proposition.

Proposition 3 For penalized splines assuming the class of scale mixtures of normal distributions, under the response perturbation scheme and considering that θ = (a^⊤, φ)^⊤ are the parameters of interest, the ∆(ω_0) matrix can be expressed as ∆(ω_0) = (∆_a^⊤(ω_0), ∆_φ^⊤(ω_0))^⊤, where

$$\Delta_a(\omega_0) = \frac{1}{\hat\phi} B^\top \widehat W, \qquad \Delta_\phi(\omega_0) = \frac{1}{\hat\phi}\, e^\top \widehat W,$$

and e = Y − Bâ_λ is the residual vector.

Note that when φ is known, the curvature matrix for the response perturbation scheme assumes the form

$$F = \Delta_a^\top(\omega_0)\{-\ddot Q_\lambda(\hat\theta \mid \hat\theta)\}^{-1}\Delta_a(\omega_0) = \frac{1}{\hat\phi} \widehat W B (B^\top \widehat W B + \lambda K^\top K)^{-1} B^\top \widehat W = \frac{1}{\hat\phi} \widehat W H_{\widehat W}(\lambda).$$

That is, this perturbation scheme is related to the generalized leverage of â_λ (Wei et al. 1998), GL(â_λ) = ∂ĝ_λ/∂Y^⊤ = H_Ŵ(λ). In fact, for a fixed or known φ, it is possible to study the influence of extreme observations on their own fitted values using the index plot of B_i(a) ∝ ĥ_ii(λ), i = 1, ..., n, where ĥ_ii(λ) denotes the ith diagonal element of the H_Ŵ(λ) matrix.

3.2.3 Scale perturbation on the smoothing parameter selection

To evaluate the effect of outlying observations on the smoothing parameter selection, we consider the perturbation scheme defined in (18). In this way, the perturbed WGCV criterion assumes the form

$$V(\lambda, \omega) = \frac{\mathrm{RSS}_{\widehat W}(\lambda, \omega)/n}{\{\operatorname{tr}(I - H_{\widehat W}(\lambda, \omega))/n\}^2}, \qquad (19)$$

where RSS_Ŵ(λ, ω) = ‖diag^{-1/2}(ω) Ŵ^{1/2}(I − H_Ŵ(λ, ω))Y‖² and H_Ŵ(λ, ω) = B(B^⊤ Ŵ diag^{-1}(ω) B + λK^⊤K)^{-1} B^⊤ Ŵ.

Note that the first term of ∂λ̂(ω)/∂ω given in Equation (17) is a scalar that can be ignored; thus

$$h_{max}(V) \propto \frac{\partial^2 V(\lambda, \omega)}{\partial\omega\,\partial\lambda}\Big|_{\omega=\omega_0,\, \lambda=\hat\lambda}.$$

Based on (19), for the smoothing parameter selection with the scale perturbation scheme, the following proposition gives the specific form of the direction of largest local curvature.


Proposition 4 For the smoothing parameter selection procedure based on the WGCV criterion, under the scale perturbation, the second derivative of V(λ, ω) with respect to ω and λ, evaluated at ω = ω_0 and λ = λ̂, can be expressed as

$$\frac{\partial^2 V(\lambda, \omega)}{\partial\omega\,\partial\lambda}\Big|_{\omega=\omega_0,\, \lambda=\hat\lambda} = -\frac{3}{c}\operatorname{tr}(G\widehat W)\, \frac{\partial V(\lambda, \omega)}{\partial\omega}\Big|_{\omega=\omega_0,\, \lambda=\hat\lambda} + \frac{1}{nc^3}\Big\{\frac{1}{n}\operatorname{tr}(G\widehat W)\operatorname{diag}(e)\widehat W(I - 2\widehat H)e + \operatorname{diag}(e)\widehat W(I - 2\widehat H)G\widehat W Y + 2\big[\operatorname{diag}(e)\widehat W G\widehat W e + \operatorname{diag}(G\widehat W Y)\widehat W(I - 2\widehat H)e\big] + \frac{2}{n}\, e^\top \widehat W G\widehat W Y\, \operatorname{dg}\big((I - 2\widehat H)G\widehat W\big)1_n - \frac{2}{n}\, \mathrm{RSS}_{\widehat W}(\hat\lambda, \omega_0)\, \operatorname{dg}\big((I - \widehat H)\widehat H\big)1_n\Big\},$$

where

$$\frac{\partial V(\lambda, \omega)}{\partial\omega}\Big|_{\omega=\omega_0,\, \lambda=\hat\lambda} = \frac{1}{nc^3}\Big\{c\operatorname{diag}(e)\widehat W(I - 2\widehat H)e + \frac{2}{n}\,\mathrm{RSS}_{\widehat W}(\hat\lambda, \omega_0)\, \operatorname{dg}\big((I - \widehat H)\widehat H\big)1_n\Big\}, \qquad (20)$$

with dg(Z) = diag(z_11, ..., z_nn) for Z = (z_ij) a square matrix of order n × n, while c = 1 − tr(Ĥ)/n, Ĥ = H_Ŵ(λ̂), G = BS^{-1}K^⊤KS^{-1}B^⊤ with S = B^⊤ŴB + λ̂K^⊤K, e = (I − Ĥ)Y, and 1_n = (1, ..., 1)^⊤ denotes an n-dimensional vector of ones.

3.2.4 Response perturbation on the smoothing parameter selection

To perturb the response variable we consider Y(ω) = Y + ω, with ω = (ω_1, ..., ω_n)^⊤, where the vector of null perturbation is ω_0 = 0. The perturbed WGCV criterion assumes the form

$$V(\lambda, \omega) = \frac{\mathrm{RSS}_{\widehat W}(\lambda, \omega)/n}{\{\operatorname{tr}(I - H_{\widehat W}(\lambda))/n\}^2},$$

where RSS_Ŵ(λ, ω) = ‖Ŵ^{1/2}(I − H_Ŵ(λ))Y(ω)‖². The following proposition provides an explicit expression for ∂²V(λ̂, ω_0)/∂ω∂λ.

Proposition 5 Let λ̂ be the value of the smoothing parameter selected according to the procedure described in Section 2.2. For the response perturbation scheme we have

$$h_{max} \propto \frac{\partial^2 V(\lambda, \omega)}{\partial\omega\,\partial\lambda}\Big|_{\omega=\omega_0,\, \lambda=\hat\lambda} = \frac{2}{nc^2}\Big\{(I - \widehat H)^\top \widehat W G \widehat W + \widehat W G \widehat W (I - \widehat H) - \frac{2}{nc}\operatorname{tr}(G\widehat W)(I - \widehat H)^\top \widehat W (I - \widehat H)\Big\}\, Y,$$

where c = 1 − tr(Ĥ)/n, Ĥ = H_Ŵ(λ̂) and G = BS^{-1}K^⊤KS^{-1}B^⊤ with S = B^⊤ŴB + λ̂K^⊤K.
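Since Proposition 5 is in closed form, it translates directly into code. A hedged R sketch (hmax_response is an illustrative name; inputs are the quantities used throughout Section 3):

```r
## Illustrative sketch: the direction h_max of Proposition 5 for response
## perturbation of the smoothing parameter selection.
hmax_response <- function(B, K, y, tau, lambda) {
  n <- length(y)
  W <- diag(tau)
  S <- crossprod(B, W %*% B) + lambda * crossprod(K)
  H <- B %*% solve(S, t(B) %*% W)                # prediction matrix H_W(lambda)
  G <- B %*% solve(S, crossprod(K) %*% solve(S, t(B)))  # G = B S^-1 K'K S^-1 B'
  IH <- diag(n) - H
  cc <- 1 - sum(diag(H)) / n                     # c = 1 - tr(H)/n
  M <- t(IH) %*% W %*% G %*% W + W %*% G %*% W %*% IH -
       (2 / (n * cc)) * sum(diag(G %*% W)) * t(IH) %*% W %*% IH
  v <- drop((2 / (n * cc^2)) * M %*% y)
  v / sqrt(sum(v^2))                             # normalize to unit length
}
```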

[Figure 1: three scatterplot panels (a)-(c) of y versus x; image not recoverable from the extraction.]

Fig. 1: Typical dataset for: (a) Cantoni and Ronchetti (2001), (b) logistic and (c) "bump" test functions

As we shall see in the confirmatory analysis for the life expectancy data, outliers can be extremely influential on the selection of the smoothing parameter (see Table 2). Thus, these perturbation schemes applied to the weighted GCV criterion allow us to note that both the fitting procedure and the smoothing parameter selection technique should be based on a robust approach. Moreover, the influence measure h_max obtained in Propositions 4 and 5 generalizes the work of Thomas (1991).

4 Numerical experiments

In this section we evaluate the performance of the proposed methodology through a simulation study and the analysis of the life expectancy data introduced by Leinhardt and Wasserman (1979). Additional experiments are reported in Appendices C and D of the supplementary material.

4.1 Simulation study

For our simulation study, we considered the model Y = g(x) + σε, with the following test functions:

$$g_1(x) = \sin(2\pi(1 - x)^2), \qquad \sigma = 0.5,$$
$$g_2(x) = \frac{1}{1 + \exp(-20(x - 1/2))}, \qquad \sigma = 0.2,$$
$$g_3(x) = x + 2\exp\big(-(16(x - 1/2))^2\big), \qquad \sigma = 0.3.$$

Function g_1 was studied by Cantoni and Ronchetti (2001), Lee and Oh (2007) and Tharmaratnam et al. (2010), while functions g_2 (logistic) and g_3 ("bump") were considered by Ruppert (2002) in his simulation study. For each of these functions, M = 500 datasets were generated. The sample size


in each case was n = 100, with design points x_1, ..., x_n generated independently from the uniform distribution U(0, 1); the random disturbances {ε_i} were generated from the contaminated normal distribution (1 − δ)N(0, 1) + δN(0, γ), for δ = 0%, 5%, 10%, 25%, 40% and γ = 2, 4, 10. We applied P-splines using third-degree B-splines and a second-order penalty, with knots selected by dividing the domain of x into 20 segments of equal width. The normal, slash and Student-t distributions were considered, for which the mixing distribution H corresponds to a point mass (at τ_i = 1), the Beta and the Gamma distribution, respectively. The degrees of freedom of the slash and Student-t distributions were fixed at 2 and 4, respectively. To gain more insight into the performance of the proposed procedure, the mean squared error (MSE) for each simulated dataset j (j = 1, ..., M) was calculated as

$$MSE_j = \frac{1}{n}\sum_{i=1}^n \big(g(x_i) - \hat g_j(x_i)\big)^2, \qquad j = 1, \dots, M,$$

where the smoothing parameter λ was chosen according to the strategy outlined in Section 2.2. Figure 1 presents a typical dataset for the case in which the data have not been contaminated; the true underlying function is also shown for the three test functions considered.

The results of the simulation study are presented in Figures 2 to 7. The results for γ = 2 are omitted because they are similar to those obtained with γ = 4. As expected, when there is no contamination, the estimated curves using heavy-tailed distributions are essentially equivalent to those obtained under Gaussian errors. However, as the percentage of contamination increases, the estimation under normality worsens, while with heavy-tailed distributions the fit remains robust against outlying observations. An interesting result in relation to the "bump" function g_3 is presented in Figures 6 and 7, where it is evident that the protection against outliers offered by the use of heavy-tailed distributions improves only for severe contaminations.
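For concreteness, here is a hedged R sketch of this data-generating scheme; rcontnorm is an illustrative helper, and the commented lines show how a fit (for example, the EM sketch from Section 2.1) would feed the MSE computation.

```r
## Illustrative sketch: contaminated-normal disturbances (1-delta)N(0,1) +
## delta N(0,gamma) and the per-dataset MSE of the simulation study.
rcontnorm <- function(n, delta, gamma) {
  out <- rbinom(n, 1, delta) == 1            # which observations are outliers
  ifelse(out, rnorm(n, sd = sqrt(gamma)), rnorm(n))
}
g1 <- function(x) sin(2 * pi * (1 - x)^2)    # Cantoni-Ronchetti test function
set.seed(123)
x <- runif(100)
y <- g1(x) + 0.5 * rcontnorm(100, delta = 0.10, gamma = 4)
# ghat <- pspline_em_t(y, x, lambda = 1)$fitted
# mse  <- mean((g1(x) - ghat)^2)
```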

4.2 Life expectancy data

To illustrate the estimation procedure and influence diagnostics described in Sections 2 and 3, we considered the life expectancy dataset introduced by Leinhardt and Wasserman (1979), who reported per capita income in US dollars and life expectancy for 101 countries in 1979. Thomas (1991) studied the local influence on the GCV criterion in smoothing splines and identified observations 9, 15 and 27 as the most influential using a scale perturbation scheme.

Figure 8 and Table 1 display the results of the fitted model considering the normal, slash and Student-t distributions. For this dataset, we observe that the estimation procedure under distributions with heavier tails than the normal produces a fitted model that is insensitive to outlying observations.

[Figures 2-4: boxplots of MSE by contamination level (0%, 5%, 10%, 25%, 40%) for the normal, Student-t and slash fits.]

Fig. 2: Boxplots of MSE under several contamination levels (δ) and variance inflation factor γ = 4 considering the three distributional assumptions for the Cantoni and Ronchetti (2001) test function.

Fig. 3: Boxplots of MSE under several contamination levels (δ) and variance inflation factor γ = 10 considering the three distributional assumptions for the Cantoni and Ronchetti (2001) test function.

Fig. 4: Boxplots of MSE under several contamination levels (δ) and variance inflation factor γ = 4 considering the three distributional assumptions for the logistic test function.

[Figures 5-7: boxplots of MSE by contamination level (0%, 5%, 10%, 25%, 40%) for the normal, Student-t and slash fits.]

Fig. 5: Boxplots of MSE under several contamination levels (δ) and variance inflation factor γ = 10 considering the three distributional assumptions for the logistic test function.

Fig. 6: Boxplots of MSE under several contamination levels (δ) and variance inflation factor γ = 4 considering the three distributional assumptions for the "bump" test function.

Fig. 7: Boxplots of MSE under several contamination levels (δ) and variance inflation factor γ = 10 considering the three distributional assumptions for the "bump" test function.

[Figure 8: scatterplot of Life Expectancy versus Per Capita Income with the three fitted curves; observations 23, 25, 27, 49, 58 and 93 are labeled.]

Fig. 8: Life expectancy data with fitted curve under three distributional assumptions: (—) normal, (- -) Student-t and (· · ·) slash models.

Table 1: Estimation summary for the life expectancy data under three fitted models.

Model       ν̂        λ̂        ℓ_λ(θ̂; Y_obs)   WGCV      iterations   time (sec.)
Normal      —        0.1021    -325.4395       48.0393   3            0.002
Slash       1.0966   2.6916    -327.9674       12.8262   104          0.080
Student-t   2.5875   4.1905    -326.8620       18.9559   79           0.071

Although there are differences in the estimated values for the degrees of freedom (ν̂), the smoothing parameter (λ̂) and the WGCV criterion, it can be noticed that the estimated curves ĝ for the slash and Student-t models are quite similar. It should be stressed that the routine heavyPS from the heavy package requires less than a tenth of a second to carry out the computations on an iMac 2,12 Intel Quad Core i5 at 3.1 GHz with 16 GB of RAM.

It is possible to identify outliers in a simple manner by considering the index plot of the Mahalanobis distances D_i²(θ) = (Y_i − b_i^⊤a)²/φ, i = 1, ..., n (Lange and Sinsheimer 1993). Under normality, D_i²(θ) follows a chi-square distribution with one degree of freedom. We use the quantile value χ²₁(ξ) with ξ = 0.975 to obtain the cutoff shown in the first panel of Figure 9. This suggests that under normal errors, observations 49, 58 and 93 are outliers. The other panels in Figure 9 indicate that when distributions with heavier tails than the normal are used, the methodology allows the accommodation of outlying observations (compare with Figure 8) by attributing small weights in the estimation procedure. Indeed, this property is related to the influence function of the PML estimation defined by Equation (5) (see Butler et al. 1990; Lucas 1997). The weights associated with the normal distribution (τ̂_i = 1, for i = 1, ..., n) are indicated in these panels as a dotted line.
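A hedged R sketch of this outlier screen (mahalanobis_plot is an illustrative name; `fit` is assumed to be the output of an estimation routine such as the EM sketch in Section 2.1):

```r
## Illustrative sketch: index plot of D_i^2 with the chi^2_1(0.975) cutoff.
mahalanobis_plot <- function(y, fit, xi = 0.975) {
  D2 <- (y - fit$fitted)^2 / fit$phi     # squared distances D_i^2(theta-hat)
  cutoff <- qchisq(xi, df = 1)           # chi-square cutoff under normality
  plot(D2, xlab = "Index", ylab = "Mahalanobis distances")
  abline(h = cutoff, lty = 2)
  which(D2 > cutoff)                     # indices of flagged outliers
}
```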

[Figure 9: index plot of Mahalanobis distances with the χ²₁(0.975) cutoff (left), and weights versus distances (center and right).]

Fig. 9: Index plot of the Mahalanobis distances under normality and estimated weights versus distances for the Student-t and slash models.

[Figure 10: index plots of GC_i^1 for the normal, Student-t and slash fits.]

Fig. 10: Index plots of generalized Cook distances.

To identify influential observations in the life expectancy dataset, we applied the influence diagnostic methods proposed in this work. Figure 10 presents the index plots of the generalized Cook distances, GC_i^1, for i = 1, ..., n, considering the three fitted models. Observations 25 and 27 exercise a strong influence on the PML estimates, with slightly less influence for observations 23 and 35. The influence of these observations decreases considerably when heavy-tailed distributions, such as the slash and Student-t, are used. When we consider the assessment of local influence applied to the complete data penalized log-likelihood function (Figures 11 and 12), we observe that under normality, observations 23, 25, 27, 58 and 93 are influential under the scale perturbation scheme, while the response perturbation scheme indicates that the fitted model is particularly sensitive when observations 27, 58 and 93 are perturbed. The group of highlighted points in the center panel may be due to a property related to the definition of the weights in the Student-t distribution (Kent et al. 1994).

[Figure 11: index plots of h_max for the normal, Student-t and slash fits.]

Fig. 11: Index plots of h_max under scale perturbation on the penalized log-likelihood function.

[Figure 12: index plots of h_max for the normal, Student-t and slash fits.]

Fig. 12: Index plots of h_max under response perturbation on the penalized log-likelihood function.

[Figure 13: index plots of h_max for the normal, Student-t and slash fits.]

Fig. 13: Index plots of h_max under scale perturbation on the smoothing parameter selection.

Table 2: Selection of the smoothing parameter (with percentage change) for the three fitted models.

                    normal                 slash                          Student-t
Obs. excluded    λ̂       (change)     λ̂       (change)   ν̂          λ̂       (change)   ν̂
none             0.1021   —           2.6916   —          1.0966      4.1905   —          2.5875
23               0.1356   33%         3.0337   13%        1.1663      4.5224   8%         2.7612
25               2.7099   2553%       2.6519   -1%        1.1673      4.0301   -4%        2.7524
27               5.6828   5464%       3.1738   18%        1.2792      4.6736   12%        3.0603
58               0.0787   -22%        2.4344   -10%       1.1484      4.0399   -4%        3.4976
93               0.0805   -21%        2.7566   2%         1.2137      4.1696   -1%        2.8930
9,15             0.0305   -70%        2.5415   -6%        1.0647      4.6219   10%        3.7019
25,27            5.9363   5712%       3.0141   12%        1.2988      4.1281   -1%        2.8900
9,15,27          5.5264   5311%       2.9576   10%        1.2180      4.5460   8%         3.0576
23,25,27         8.5358   8257%       3.9123   45%        1.7378      4.5902   10%        3.3215
9,15,23,25,27    8.3551   8080%       3.3817   26%        1.4475      5.0877   21%        4.1768
23,25,27,58,93   4.8414   4640%       3.7382   39%        3.0355      4.5478   9%         4.5478

It is evident that this influence decreases when we use distributions with heavier tails than the normal. Figure 13 presents the influence graphs for the scale perturbation scheme applied to the weighted cross validation procedure. Under normally distributed errors, we identify that observations 9, 15 and 27 exercise a strong influence on the selection of the smoothing parameter (see also Thomas 1991); thus, our results generalize the ones reported by Thomas (1991). Again it can be appreciated that the estimation procedure considering heavy-tailed distributions is an effective approach to accommodate outliers.

With the objective of investigating the sensitivity of the smoothing parameter selection to outlying observations, a confirmatory study was conducted that consisted of dropping observations from the dataset and obtaining an estimate of λ by minimizing the WGCV criterion. The results were compared with the original estimate. Table 2 presents the estimates and percentages of relative change for each of the fitted models. This analysis reveals the extreme sensitivity of the smoothing parameter selection under normal errors. In fact, the highest percentage change (8257%) occurs when observations 23, 25 and 27 are removed. The results demonstrate the ability of SMN distributions to reduce the influence of extreme observations. The stability that can be noted in the estimation of the degrees of freedom for the slash and Student-t distributions reveals appropriate protection against outliers.

5 Discussion

This work describes the estimation of parameters and the influence diagnostics in smoothing via penalized splines considering the class of scale mixtures of normal distributions. It also addresses the resistant selection of the parameter that controls the smoothness of the fitted curve. The results of the numerical


experiments highlight the ability of the proposed procedure to accommodate outlying data. The estimation procedure can be seen as an alternative approach to the works of Cantoni and Ronchetti (2001), Lee and Cox (2009) and Ibacache-Pulgar and Paula (2011). It should be stressed that, using the Laplace distribution (Phillips 2002), our procedure can tackle median (L_1) nonparametric regression (Koenker et al. 1994). The numerical implementation of the estimation procedure using a penalized EM algorithm is simple and computationally efficient. The routines developed are available in the R heavy package (Osorio 2014). Although we have not seen this situation in our numerical experiments, Gu (1992) presented a discussion in which a series of authors indicated that the procedure delineated in Section 2.2 may fail to converge. The author is currently working on an alternative way to carry out the selection of the smoothing parameter whose convergence is assured.

The proposed methodology can be seen as an M-estimation procedure (see, for instance, Maronna 1976; Lange et al. 1989); thus we can expect that the procedure performs well for variance-inflation models (Cook et al. 1982) like the one used in the Monte Carlo simulation study. In an additional simulation study (see supplementary material) we considered an asymmetric contamination scheme varying the percentage of outliers (0%, 5%, 10%, 25% and 40%). This kind of extreme contamination reveals that statistical modeling using distributions with heavier tails than the normal is not a panacea for all robustness problems (Lange et al. 1989). It is interesting to note that the penalized EM estimation under heavy-tailed distributions produces curve estimates very similar to those reported by Tharmaratnam et al. (2010) when the shape parameters are fixed at very small values. However, models derived under heavy-tailed symmetric distributions can still be vulnerable to extreme and influential observations. The role and definition of outlying and influential observations in models with heavier tails than the normal have not been completely studied. In our opinion, an avenue for new developments is to use the mean-shift outlier model (Wei and Shih 1994).

Explicit expressions have been developed for the matrices required to diagnose influence considering case deletion techniques and the local influence method. Interestingly, for the example with real data, all the diagnostic techniques yielded complementary results. The evaluation of influence in this work extends the earlier results of, for example, Eubank and Gunst (1986), Kim (1996) and Wei (2004), who considered the deletion methodology, while the study of local influence generalizes the results of Thomas (1991). We plan to develop an R package to implement the influence diagnostics presented in this work, as a complement to the heavy package. The results developed in this work can easily be adapted to the context of ridge regression, in which case the proposed methodology extends the works of Walker and Birch (1988), Billor and Loynes (1999) and Shi and Wang (1999). It is planned to extend the parameter estimation as well as the influence assessment considering distributions with heavier tails than the normal to semiparametric nonlinear mixed effects models according to the approach proposed by Elmi et al. (2011). To reach this objective may require

Influence diagnostics for robust P-splines

25

ering an estimation procedure using a stochastic EM algorithm such as that presented by Meza et al. (2012). Supplemental Materials Supplementary Material includes details about the computational implementation, proofs of Propositions 1 to 5 and additional results obtained from a Monte Carlo simulation study and a real data analysis. Acknowledgements I would like to thank the reviewers for their constructive comments, which helped to substantially improve this manuscript. I am grateful to Victor Leiva for his careful reading and comments on an earlier version of this paper. I also thank Ronny Vallejos and Patricio Videla for their valuable suggestions. The author was partially supported by grants CONICYT 791100007 and FONDECYT 1140580.

References

Abramowitz, M., Stegun, I.A. (1970). Handbook of Mathematical Functions. New York: Dover.
Andrews, D.F., Mallows, C.L. (1974). Scale mixtures of normal distributions. Journal of the Royal Statistical Society, Series B, 36, 99-102.
Billor, N., Loynes, R.M. (1999). An application of the local influence approach to ridge regression. Journal of Applied Statistics, 26, 177-183.
Butler, R.J., McDonald, J.B., Nelson, R.D., White, S.B. (1990). Robust and partially adaptive estimation of regression models. The Review of Economics and Statistics, 72, 321-327.
Cantoni, E., Ronchetti, E. (2001). Resistant selection of the smoothing parameter for smoothing splines. Statistics and Computing, 11, 141-146.
Cook, R.D. (1986). Assessment of local influence (with discussion). Journal of the Royal Statistical Society, Series B, 48, 133-169.
Cook, R.D., Weisberg, S. (1982). Residuals and Influence in Regression. New York: Chapman & Hall.
Cook, R.D., Holschuh, N., Weisberg, S. (1982). A note on an alternative outlier model. Journal of the Royal Statistical Society, Series B, 44, 370-376.
Craven, P., Wahba, G. (1979). Smoothing noisy data with spline functions. Numerische Mathematik, 31, 377-403.
Dempster, A.P., Laird, N.M., Rubin, D.B. (1977). Maximum likelihood from incomplete data via the EM algorithm (with discussion). Journal of the Royal Statistical Society, Series B, 39, 1-38.
Eilers, P.H.C., Marx, B.D. (1996). Flexible smoothing using B-splines and penalties (with discussion). Statistical Science, 11, 89-121.
Eilers, P.H.C., Marx, B.D. (2010). Splines, knots, and penalties. WIREs Computational Statistics, 2, 637-653. doi: 10.1002/wics.125
Elmi, A., Ratcliffe, S.J., Parry, S., Guo, W. (2011). A B-spline based semiparametric nonlinear mixed effects model. Journal of Computational and Graphical Statistics, 20, 492-509.
Escobar, L.A., Meeker, W.Q. (1992). Assessing influence in regression analysis with censored data. Biometrics, 48, 507-528.
Eubank, R.L. (1984). The hat matrix for smoothing splines. Statistics & Probability Letters, 2, 9-14.
Eubank, R.L. (1985). Diagnostics for smoothing splines. Journal of the Royal Statistical Society, Series B, 47, 332-341.
Eubank, R.L., Gunst, R.F. (1986). Diagnostics for penalized least-squares estimators. Statistics & Probability Letters, 4, 265-272.
Fernández, C., Steel, M.F.J. (1999). Multivariate Student-t regression models: pitfalls and inference. Biometrika, 86, 153-167.
Green, P.J. (1990). On use of the EM algorithm for penalized likelihood. Journal of the Royal Statistical Society, Series B, 52, 443-452.
Gu, C. (1992). Cross-validating non-Gaussian data. Journal of Computational and Graphical Statistics, 1, 169-179.
Gu, C., Xiang, D. (2001). Cross-validating non-Gaussian data: Generalized approximate cross-validation revisited. Journal of Computational and Graphical Statistics, 10, 581-591.
Hoerl, A.E., Kennard, R.W. (1970). Ridge regression: biased estimation for nonorthogonal problems. Technometrics, 12, 55-67.
Huber, P.J. (1979). Robust smoothing. In R.L. Launer and G.N. Wilkinson (Eds.), Robustness in Statistics (pp. 33-48). New York: Academic Press.
Ibacache-Pulgar, G., Paula, G.A. (2011). Local influence for Student-t partially linear models. Computational Statistics & Data Analysis, 55, 1462-1478.
Jamshidian, M. (1999). Adaptive robust regression by using a nonlinear regression program. Journal of Statistical Software, 4(6), 1-25.
Kent, J.T., Tyler, D.E., Vardi, Y. (1994). A curious likelihood identity for the multivariate t-distribution. Communications in Statistics: Simulation and Computation, 23, 441-453.
Kim, C. (1996). Cook's distance in spline smoothing. Statistics & Probability Letters, 31, 139-144.
Kim, C., Park, B.U., Kim, W. (2002). Influence diagnostics in semiparametric regression models. Statistics & Probability Letters, 60, 49-58.
Koenker, R., Ng, P., Portnoy, S. (1994). Quantile smoothing splines. Biometrika, 81, 673-680.
Lange, K., Sinsheimer, J.S. (1993). Normal/independent distributions and their applications in robust regression. Journal of Computational and Graphical Statistics, 2, 175-198.
Lange, K., Little, R.J.A., Taylor, J.M.G. (1989). Robust statistical modeling using the t distribution. Journal of the American Statistical Association, 84, 881-896.
Lee, J.S., Cox, D.D. (2009). Robust smoothing: smoothing parameter selection and applications to fluorescence spectroscopy. Computational Statistics & Data Analysis, 54, 3131-3143.
Lee, T.C.M., Oh, H.S. (2007). Robust penalized regression spline fitting with application to additive mixed modeling. Computational Statistics, 22, 159-171.
Leinhardt, S., Wasserman, S.S. (1979). Teaching regression: An exploratory approach. The American Statistician, 33, 196-203.
Lin, T.I., Lee, J.C. (2006). A robust approach to t linear mixed models applied to multiple sclerosis data. Statistics in Medicine, 25, 1397-1412.
Little, R.J.A. (1988). Robust estimation of the mean and covariance matrix from data with missing values. Applied Statistics, 37, 23-38.
Lucas, A. (1997). Robustness of the Student t based M-estimator. Communications in Statistics: Theory and Methods, 26, 1165-1182.
Manchester, L. (1996). Empirical influence for robust smoothing. Australian Journal of Statistics, 38, 275-290.
Maronna, R.A. (1976). Robust M-estimators of multivariate location and scatter. The Annals of Statistics, 4, 51-67.
Mateos, G., Giannakis, G.B. (2012). Robust nonparametric regression via sparsity control with application to load curve data cleansing. IEEE Transactions on Signal Processing, 60, 1571-1584.
McLachlan, G.J., Krishnan, T. (1997). The EM Algorithm and Extensions. New York: Wiley.
Meza, C., Osorio, F., De la Cruz, R. (2012). Estimation in nonlinear mixed-effects models using heavy-tailed distributions. Statistics and Computing, 22, 121-139.
Moore, R.J. (1982). Algorithm AS 187: Derivatives of the incomplete gamma integral. Applied Statistics, 31, 330-335.
Oh, H.S., Brown, T., Charbonneau, P. (2004). Period analysis of variable stars by robust smoothing. Applied Statistics, 53, 15-30.
Oh, H.S., Lee, J., Kim, D. (2008). A recipe for robust estimation using pseudo data. Journal of the Korean Statistical Society, 37, 63-72.
O'Sullivan, F., Yandell, B.S., Raynor, W.J. (1986). Automatic smoothing of regression functions in generalized linear models. Journal of the American Statistical Association, 81, 96-103.
Osorio, F. (2014). heavy: Package for robust estimation using heavy-tailed distributions. R package version 0.2-35. URL: CRAN.R-project.org/package=heavy.
Phillips, R.F. (2002). Least absolute deviations estimation via the EM algorithm. Statistics and Computing, 12, 281-285.
Pinheiro, J., Liu, C., Wu, Y. (2001). Efficient algorithms for robust estimation in linear mixed-effects models using the multivariate t distribution. Journal of Computational and Graphical Statistics, 10, 249-276.
Poon, W., Poon, Y.S. (1999). Conformal normal curvature and assessment of local influence. Journal of the Royal Statistical Society, Series B, 61, 51-61.
R Core Team (2014). R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. URL: http://www.R-project.org.
Ruppert, D. (2002). Selecting the number of knots for penalized splines. Journal of Computational and Graphical Statistics, 11, 735-757.
Ruppert, D., Wand, M.P., Carroll, R.J. (2003). Semiparametric Regression. Cambridge: Cambridge University Press.
Shi, L., Wang, X. (1999). Local influence in ridge regression. Computational Statistics & Data Analysis, 31, 341-353.
Silverman, B.W. (1985). Some aspects of the spline smoothing approach to non-parametric regression curve fitting (with discussion). Journal of the Royal Statistical Society, Series B, 47, 1-52.
Staudenmayer, J., Lake, E.E., Wand, M.P. (2009). Robustness for general design mixed models using the t-distribution. Statistical Modelling, 9, 235-255.
Tharmaratnam, K., Claeskens, G., Croux, C., Salibián-Barrera, M. (2010). S-estimation for penalized regression splines. Journal of Computational and Graphical Statistics, 19, 609-625.
Thomas, W. (1991). Influence diagnostics for the cross-validated smoothing parameter in spline smoothing. Journal of the American Statistical Association, 86, 693-698.
Utreras, F.I. (1981). On computing robust splines and applications. SIAM Journal on Scientific and Statistical Computing, 2, 153-163.
Walker, E., Birch, J.B. (1988). Influence measures in ridge regression. Technometrics, 30, 221-227.
Wei, W.H. (2004). Derivatives diagnostics and robustness for smoothing splines. Computational Statistics & Data Analysis, 46, 335-356.
Wei, W.H. (2005). The smoothing parameter, confidence interval and robustness for smoothing splines. Journal of Nonparametric Statistics, 17, 613-642.
Wei, B.C., Shih, J.Q. (1994). On statistical models for regression diagnostics. Annals of the Institute of Statistical Mathematics, 46, 267-278.
Wei, B.C., Hu, Y.Q., Fung, W.K. (1998). Generalized leverage and its applications. Scandinavian Journal of Statistics, 25, 25-37.
Xiang, D., Wahba, G. (1996). A generalized approximate cross validation for smoothing splines with non-Gaussian data. Statistica Sinica, 6, 675-692.
Zhu, H., Lee, S.Y. (2001). Local influence for incomplete-data models. Journal of the Royal Statistical Society, Series B, 63, 111-126.
Zhu, H., Lee, S.Y., Wei, B.C., Zhou, J. (2001). Case-deletion measures for models with incomplete data. Biometrika, 88, 727-737.

INFLUENCE DIAGNOSTICS FOR ROBUST P-SPLINES USING SCALE MIXTURE OF NORMAL DISTRIBUTIONS

FELIPE OSORIO

Supplementary Material

Appendix A presents details about the computational implementation. Proofs of Propositions 1 to 5 are established in Appendix B. Additional results obtained from a real data analysis and a Monte Carlo simulation study are presented in Appendices C and D, respectively.

Appendix A. Notes about the computational implementation

Below we describe the computational strategy adopted in the heavy R package, which uses the estimation procedure proposed in this work. A set of routines has been written in C in order to accelerate some of the calculations. The implementation is fairly simple, and in most cases it has been possible to draw upon routines from the BLAS library (Lawson et al., 1979), Linpack (Dongarra et al., 1979) and Mathlib, included in the R software (R Core Team, 2014).

The M step of the algorithm, described in Equations (9) and (10) from Section 2.1, can be efficiently computed by considering $U = W^{(k)1/2} B$ and decomposing $U$ via a singular value decomposition $U = U D V^\top$, where $U$ is an $n \times p$ matrix such that $U^\top U = I_p$, $D = \mathrm{diag}(\delta_1, \dots, \delta_p)$ and $V$ is an orthogonal $p \times p$ matrix. This form of the decomposition highlights that the matrix $U$ can be overwritten in order to keep storage to a minimum. A second singular value decomposition is then calculated, $K V D^{-1} = Q R S^\top$, where only $R = \mathrm{diag}(r_1, \dots, r_p)$ and the orthogonal $p \times p$ matrix $S$ are stored. Let $Z = U^\top W^{(k)1/2} Y$ and $c = S^\top Z$. Thus, we define
\[
\alpha_\lambda^{(k+1)} = S (I + \lambda R^2)^{-1} c, \qquad \lambda > 0, \tag{A.1}
\]
and proceed to calculate the fitted values and residuals for the current fit as
\[
g_\lambda^{(k+1)} = U \alpha_\lambda^{(k+1)} \quad \text{and} \quad e_\lambda^{(k+1)} = Z - g_\lambda^{(k+1)},
\]
respectively. Hence, Equation (10) adopts the form
\[
\phi_\lambda^{(k+1)} = \frac{1}{n}\big\{\|e_\lambda^{(k+1)}\|^2 + \lambda \|R (I + \lambda R^2)^{-1} c\|^2\big\}. \tag{A.2}
\]


Using the results above, it is possible to rewrite the WGCV criterion defined in (14) from Section 2.2 for the smoothing parameter selection as
\[
V(\lambda) = \frac{1}{n} \frac{\|e_\lambda^{(k+1)}\|^2}{(1 - \mathrm{edf}/n)^2}, \tag{A.3}
\]
where edf represents the effective number of parameters, defined as $\mathrm{edf} = \mathrm{tr}\, H_{\widehat{W}}(\lambda)$, which can be calculated efficiently as
\[
\mathrm{edf} = \mathrm{tr}(I + \lambda R^2)^{-1} = \sum_{j=1}^{p} \frac{1}{1 + \lambda r_j^2}.
\]
We use Brent's method (Brent, 1973) to carry out the minimization of $V(\lambda)$ given in (A.3). Finally, once convergence of the algorithm has been reached, we take
\[
\widehat{a}_\lambda = V D^{-1} \widehat{\alpha}_\lambda, \qquad \widehat{g}_\lambda = \widehat{W}^{-1/2} U \widehat{\alpha}_\lambda.
\]
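For illustration, the two-SVD strategy behind (A.1)-(A.3) can be written in a few lines of R. This is a minimal sketch under stated assumptions, not the code of the heavy package: the function pml_step and its arguments are hypothetical, the penalty matrix K is assumed to be p x p (e.g., a difference matrix padded with zero rows), and the residuals are formed on the weighted n-dimensional scale, so that their squared norm is the RSS term entering (A.2) and (A.3).

pml_step <- function(y, B, K, w, lambda) {
  n <- length(y)
  U <- sqrt(w) * B                              # U = W^(1/2) B, with W = diag(w)
  sv1 <- svd(U)                                 # first SVD: U = U1 D V'
  sv2 <- svd(K %*% sv1$v %*% diag(1 / sv1$d))   # second SVD: K V D^(-1) = Q R S'
  S <- sv2$v; r <- sv2$d                        # only R and S need to be stored
  Z <- crossprod(sv1$u, sqrt(w) * y)            # Z = U1' W^(1/2) y
  cc <- drop(crossprod(S, Z))                   # c = S' Z
  alpha <- S %*% (cc / (1 + lambda * r^2))      # update (A.1)
  e <- sqrt(w) * y - sv1$u %*% alpha            # weighted residuals
  pen <- sum((r * cc / (1 + lambda * r^2))^2)   # ||R (I + lambda R^2)^(-1) c||^2
  phi <- (sum(e^2) + lambda * pen) / n          # scale update (A.2)
  edf <- sum(1 / (1 + lambda * r^2))            # tr(I + lambda R^2)^(-1)
  wgcv <- (sum(e^2) / n) / (1 - edf / n)^2      # criterion (A.3)
  a <- sv1$v %*% (alpha / sv1$d)                # back-transform: a = V D^(-1) alpha
  list(a = a, phi = phi, edf = edf, wgcv = wgcv)
}

The minimization of $V(\lambda)$ can then be delegated to R's optimize(), whose underlying algorithm is precisely Brent's (1973) method, e.g., optimize(function(l) pml_step(y, B, K, w, l)$wgcv, interval = c(1e-4, 1e2)).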

The estimation procedure according to the algorithm described in this section is available in the heavyPS function of the heavy package.

Appendix B. Proofs of main results

In this appendix we derive the differentials $d^2_\theta Q_\lambda(\theta|\widehat\theta)$, $d^2_{\omega\theta} Q_\lambda(\theta, \omega|\widehat\theta)$ and $d^2_{\omega\theta} V(\lambda, \omega)$ for the scale and response perturbation schemes. The necessary matrices $\ddot{Q}(\widehat\theta|\widehat\theta)$, $\Delta(\omega)$ and the vector $\partial^2 V(\lambda, \omega)/\partial\lambda\,\partial\omega^\top$ are obtained efficiently using the differentiation method and by applying some identification theorems discussed in Magnus and Neudecker (1999).

Proof of Proposition 1. The complete-data penalized log-likelihood function given in Equation (6) from Section 2.1 is
\[
Q_\lambda(\theta|\widehat\theta) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}\big[(Y - Ba)^\top \widehat{W} (Y - Ba) + \lambda a^\top K^\top K a\big].
\]
Differentiating $Q_\lambda(\theta|\widehat\theta)$ with respect to $a$, we obtain
\[
d_a Q_\lambda(\theta|\widehat\theta) = \frac{1}{\phi}\{(Y - Ba)^\top \widehat{W} B - \lambda a^\top K^\top K\}\, da,
\qquad
d^2_a Q_\lambda(\theta|\widehat\theta) = -\frac{1}{\phi} (da)^\top \{B^\top \widehat{W} B + \lambda K^\top K\}\, da,
\]
and the differential of $d_a Q_\lambda(\theta|\widehat\theta)$ with respect to $\phi$ leads to
\[
d^2_{\phi a} Q_\lambda(\theta|\widehat\theta) = -\frac{1}{\phi^2}\, d\phi\, \{(Y - Ba)^\top \widehat{W} B - \lambda a^\top K^\top K\}\, da.
\]
In addition, the differentials of $Q_\lambda(\theta|\widehat\theta)$ with respect to $\phi$ are given by
\[
d_\phi Q_\lambda(\theta|\widehat\theta) = -\frac{n}{2\phi}\, d\phi + \frac{1}{2\phi^2}\{(Y - Ba)^\top \widehat{W} (Y - Ba) + \lambda a^\top K^\top K a\}\, d\phi,
\]
\[
d^2_\phi Q_\lambda(\theta|\widehat\theta) = \frac{n}{2\phi^2}\, d^2\phi - \frac{1}{\phi^3}\{(Y - Ba)^\top \widehat{W} (Y - Ba) + \lambda a^\top K^\top K a\}\, d^2\phi.
\]
From the first-order conditions
\[
B^\top \widehat{W} Y - (B^\top \widehat{W} B + \lambda K^\top K)\, \widehat{a} = 0, \tag{B.1}
\]
\[
n\widehat\phi - \big(\mathrm{RSS}_{\widehat W}(\widehat{a}) + \lambda\, \widehat{a}^\top K^\top K \widehat{a}\big) = 0, \tag{B.2}
\]


it follows immediately that
\[
d^2_a Q_\lambda(\theta|\widehat\theta)\big|_{\theta=\widehat\theta} = -\frac{1}{\widehat\phi}(da)^\top \{B^\top \widehat{W} B + \lambda K^\top K\}\, da,
\qquad
d^2_{\phi a} Q_\lambda(\theta|\widehat\theta)\big|_{\theta=\widehat\theta} = 0,
\qquad
d^2_\phi Q_\lambda(\theta|\widehat\theta)\big|_{\theta=\widehat\theta} = -\frac{n}{2\widehat\phi^2}\, d^2\phi.
\]
Applying the second identification theorem given in Magnus and Neudecker (1999) we obtain $\ddot{Q}(\widehat\theta|\widehat\theta)$, and the proof is complete.

Proof of Proposition 2. The Q-function for the perturbed model introduced in Equation (18) from Subsection 3.2.1 assumes the form
\[
Q_\lambda(\theta, \omega|\widehat\theta) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}\big[(Y - Ba)^\top \widehat{W}^{1/2} \mathrm{diag}(\omega) \widehat{W}^{1/2} (Y - Ba) + \lambda a^\top K^\top K a\big],
\]
where $\omega$ is an $n$-dimensional perturbation vector; the perturbed model reduces to the postulated model when $\omega = \omega_0 = \mathbf{1}$.

Taking differentials of $Q_\lambda(\theta, \omega|\widehat\theta)$ with respect to $\theta = (a^\top, \phi)^\top$, we get
\[
d_a Q_\lambda(\theta, \omega|\widehat\theta) = \frac{1}{\phi}\{(Y - Ba)^\top \widehat{W} \mathrm{diag}(\omega) B - \lambda a^\top K^\top K\}\, da,
\]
\[
d_\phi Q_\lambda(\theta, \omega|\widehat\theta) = -\frac{n}{2\phi}\, d\phi + \frac{1}{2\phi^2}\{(Y - Ba)^\top \widehat{W}^{1/2} \mathrm{diag}(\omega) \widehat{W}^{1/2} (Y - Ba) + \lambda a^\top K^\top K a\}\, d\phi.
\]
Using that $z^\top \mathrm{diag}(\omega) = \omega^\top \mathrm{diag}(z)$ and $\mathrm{diag}(\omega)\,\mathrm{diag}(z) = \mathrm{diag}(z)\,\mathrm{diag}(\omega)$ for an $n$-dimensional vector $z$, the differentials $d_a Q_\lambda(\theta, \omega|\widehat\theta)$ and $d_\phi Q_\lambda(\theta, \omega|\widehat\theta)$ can be written as
\[
d_a Q_\lambda(\theta, \omega|\widehat\theta) = \frac{1}{\phi}\{\omega^\top \mathrm{diag}(\epsilon) \widehat{W} B - \lambda a^\top K^\top K\}\, da,
\]
\[
d_\phi Q_\lambda(\theta, \omega|\widehat\theta) = -\frac{n}{2\phi}\, d\phi + \frac{1}{2\phi^2}\{\omega^\top \mathrm{diag}(\epsilon) \widehat{W} \epsilon + \lambda a^\top K^\top K a\}\, d\phi,
\]
where $\epsilon = Y - Ba$. To find $\Delta(\omega)$, we differentiate $d_a Q_\lambda(\theta, \omega|\widehat\theta)$ and $d_\phi Q_\lambda(\theta, \omega|\widehat\theta)$ with respect to $\omega$, thus
\[
d^2_{\omega a} Q_\lambda(\theta, \omega|\widehat\theta) = \frac{1}{\phi}(d\omega)^\top \mathrm{diag}(\epsilon) \widehat{W} B\, da, \tag{B.3}
\]
\[
d^2_{\omega\phi} Q_\lambda(\theta, \omega|\widehat\theta) = \frac{1}{2\phi^2}(d\omega)^\top \mathrm{diag}(\epsilon) \widehat{W} \epsilon\, d\phi. \tag{B.4}
\]

Applying the second identification theorem given in Magnus and Neudecker (1999) and evaluating (B.3) and (B.4) at $\theta = \widehat\theta$ and $\omega = \omega_0$ we obtain $\Delta(\omega_0)$, and the proof is complete.

Proof of Proposition 3. Under the response perturbation $Y(\omega) = Y + \omega$, the Q-function for the perturbed model is given by
\[
Q_\lambda(\theta, \omega|\widehat\theta) = -\frac{n}{2}\log\phi - \frac{1}{2\phi}\big[(Y(\omega) - Ba)^\top \widehat{W} (Y(\omega) - Ba) + \lambda a^\top K^\top K a\big],
\]
where $\omega$ is an $n$-dimensional perturbation vector and the null perturbation vector is given by $\omega_0 = \mathbf{0}$.


The first differential of $Q_\lambda(\theta, \omega|\widehat\theta)$ with respect to $\theta = (a^\top, \phi)^\top$ is
\[
d_a Q_\lambda(\theta, \omega|\widehat\theta) = \frac{1}{\phi}\{(Y(\omega) - Ba)^\top \widehat{W} B - \lambda a^\top K^\top K\}\, da,
\]
\[
d_\phi Q_\lambda(\theta, \omega|\widehat\theta) = -\frac{n}{2\phi}\, d\phi + \frac{1}{2\phi^2}\{(Y(\omega) - Ba)^\top \widehat{W} (Y(\omega) - Ba) + \lambda a^\top K^\top K a\}\, d\phi.
\]
Taking the differential of $d_a Q_\lambda(\theta, \omega|\widehat\theta)$ and $d_\phi Q_\lambda(\theta, \omega|\widehat\theta)$ with respect to $\omega$ we have
\[
d^2_{\omega a} Q_\lambda(\theta, \omega|\widehat\theta) = \frac{1}{\phi}(d\omega)^\top \widehat{W} B\, da, \tag{B.5}
\]
\[
d^2_{\omega\phi} Q_\lambda(\theta, \omega|\widehat\theta) = \frac{1}{2\phi^2}(d\omega)^\top \widehat{W} (Y(\omega) - Ba)\, d\phi. \tag{B.6}
\]
The $\Delta(\omega_0)$ matrix can be obtained by applying the second identification theorem (Magnus and Neudecker, 1999) and evaluating (B.5) and (B.6) at $\theta = \widehat\theta$ and $\omega = \omega_0$. Thus, the proposition is verified.

Proof of Proposition 4. The weighted cross-validation criterion under the scale perturbation defined by Equation (19) from Subsection 3.2.3 is given by
\[
V(\lambda, \omega) = \frac{\mathrm{RSS}_{\widehat W}(\lambda, \omega)/n}{\{\mathrm{tr}(I - H_{\widehat W}(\lambda, \omega))/n\}^2},
\]
where
\[
\mathrm{RSS}_{\widehat W}(\lambda, \omega) = Y^\top (I - H_{\widehat W}(\lambda, \omega))^\top \widehat{W}^{1/2} \mathrm{diag}(\omega) \widehat{W}^{1/2} (I - H_{\widehat W}(\lambda, \omega)) Y,
\]
and
\[
H_{\widehat W}(\lambda, \omega) = B(B^\top \widehat{W} \mathrm{diag}(\omega) B + \lambda K^\top K)^{-1} B^\top \widehat{W} \mathrm{diag}(\omega).
\]
Next, we shall write $\mathrm{RSS}_{\widehat W}(\lambda, \omega) = \mathrm{RSS}(\lambda, \omega)$ and $H_{\widehat W}(\lambda, \omega) = H(\lambda, \omega)$. Let $\mathrm{edf}(\lambda, \omega) = \mathrm{tr}\, H(\lambda, \omega)$ be the effective number of parameters for the perturbed model. Taking the differential of $V(\lambda, \omega)$ with respect to $\omega$ leads to
\[
d_\omega V(\lambda, \omega) = \frac{1/n}{(1 - \mathrm{edf}(\lambda, \omega)/n)^4}\Big\{(1 - \mathrm{edf}(\lambda, \omega)/n)^2\, d_\omega \mathrm{RSS}(\lambda, \omega) - \mathrm{RSS}(\lambda, \omega)\, d_\omega (1 - \mathrm{edf}(\lambda, \omega)/n)^2\Big\}. \tag{B.7}
\]
It is easy to see that
\[
d_\omega H(\lambda, \omega) = -B S^{-1} B^\top \widehat{W} \mathrm{diag}(d\omega) B S^{-1} B^\top \widehat{W} \mathrm{diag}(\omega) + B S^{-1} B^\top \widehat{W} \mathrm{diag}(d\omega) = B S^{-1} B^\top \widehat{W} \mathrm{diag}(d\omega)(I - H(\lambda, \omega)), \tag{B.8}
\]
where $S = B^\top \widehat{W} \mathrm{diag}(\omega) B + \lambda K^\top K$. Thus, using the properties of the trace operator, we obtain
\[
d_\omega (1 - \mathrm{edf}(\lambda, \omega)/n)^2 = -\frac{2}{n}(1 - \mathrm{edf}(\lambda, \omega)/n) \times \mathbf{1}^\top \mathrm{dg}\big((I - H(\lambda, \omega)) B S^{-1} B^\top \widehat{W}\big)\, d\omega. \tag{B.9}
\]
Let $e_\omega = (I - H(\lambda, \omega)) Y$; direct calculations show that
\[
d_\omega \mathrm{RSS}(\lambda, \omega) = -Y^\top (d_\omega H(\lambda, \omega))^\top \widehat{W}^{1/2} \mathrm{diag}(\omega) \widehat{W}^{1/2} e_\omega - e_\omega^\top \widehat{W}^{1/2} \mathrm{diag}(\omega) \widehat{W}^{1/2} (d_\omega H(\lambda, \omega)) Y + e_\omega^\top \widehat{W}^{1/2} \mathrm{diag}(d\omega) \widehat{W}^{1/2} e_\omega. \tag{B.10}
\]


Substituting equation (B.8) into (B.10), and using some simple algebra, we obtain
\[
d_\omega \mathrm{RSS}(\lambda, \omega) = e_\omega^\top (I - 2H(\lambda, \omega))^\top \widehat{W} \mathrm{diag}(e_\omega)\, d\omega. \tag{B.11}
\]
Therefore, using Equations (B.9) and (B.11), the first differential of $V(\lambda, \omega)$ with respect to $\omega$ given in Equation (B.7) can now be written as
\[
d_\omega V(\lambda, \omega) = \frac{1/n}{(1 - \mathrm{edf}(\lambda, \omega)/n)^3}\Big\{(1 - \mathrm{edf}(\lambda, \omega)/n)\, e_\omega^\top (I - 2H(\lambda, \omega))^\top \widehat{W} \mathrm{diag}(e_\omega) + \frac{2}{n} \mathrm{RSS}(\lambda, \omega)\, \mathbf{1}^\top \mathrm{dg}\big((I - H(\lambda, \omega)) B S^{-1} B^\top \widehat{W}\big)\Big\}\, d\omega. \tag{B.12}
\]
Applying the first identification theorem by Magnus and Neudecker (1999) and evaluating (B.12) at $\lambda = \widehat\lambda$ and $\omega = \omega_0$ leads to $\partial V(\widehat\lambda, \omega_0)/\partial\omega$ given in Equation (20) from Subsection 3.2.3.

Using matrix differentiation, we have
\[
d_\lambda H(\lambda, \omega) = -G \widehat{W} \mathrm{diag}(\omega)\, d\lambda, \tag{B.13}
\]
where $G = B S^{-1} K^\top K S^{-1} B^\top$. Thus,
\[
d_\lambda (1 - \mathrm{edf}(\lambda, \omega)/n)^{-3} = -\frac{3}{n}(1 - \mathrm{edf}(\lambda, \omega)/n)^{-4}\, \mathrm{tr}(G \widehat{W} \mathrm{diag}(\omega))\, d\lambda, \tag{B.14}
\]
and
\[
d_\lambda e_\omega = -(d_\lambda H(\lambda, \omega)) Y = G \widehat{W} \mathrm{diag}(\omega) Y\, d\lambda. \tag{B.15}
\]
Therefore, taking the differential of $\mathrm{RSS}(\lambda, \omega)$ with respect to $\lambda$ we obtain
\[
d_\lambda \mathrm{RSS}(\lambda, \omega) = 2\, Y^\top \mathrm{diag}(\omega) \widehat{W} G \widehat{W} \mathrm{diag}(\omega)\, e_\omega\, d\lambda. \tag{B.16}
\]
Using Equations (B.12)-(B.16), after some algebra the differential of $d_\omega V(\lambda, \omega)$ with respect to $\lambda$ is given by
\[
\begin{aligned}
d^2_{\lambda\omega} V(\lambda, \omega) ={}& -\frac{3}{n}\,\frac{\mathrm{tr}(G \widehat{W} \mathrm{diag}(\omega))}{1 - \mathrm{edf}(\lambda, \omega)/n}\; d\lambda\; d_\omega V(\lambda, \omega) \\
&+ \frac{1/n}{(1 - \mathrm{edf}(\lambda, \omega)/n)^3}\Big\{\frac{1}{n}\,\mathrm{tr}(G \widehat{W} \mathrm{diag}(\omega))\, e_\omega^\top (I - 2H(\lambda, \omega))^\top \widehat{W} \mathrm{diag}(e_\omega) \\
&\quad + (1 - \mathrm{edf}(\lambda, \omega)/n)\big(Y^\top \mathrm{diag}(\omega) \widehat{W} G (I - 2H(\lambda, \omega))^\top \widehat{W} \mathrm{diag}(e_\omega) \\
&\quad + 2\, e_\omega^\top \mathrm{diag}(\omega) \widehat{W} G \widehat{W} \mathrm{diag}(e_\omega) + e_\omega^\top (I - 2H(\lambda, \omega))^\top \widehat{W} \mathrm{diag}(G \widehat{W} \mathrm{diag}(\omega) Y)\big) \\
&\quad + \frac{4}{n}\, Y^\top \mathrm{diag}(\omega) \widehat{W} G \widehat{W} \mathrm{diag}(\omega)\, e_\omega\, \mathbf{1}^\top \mathrm{dg}\big((I - H(\lambda, \omega)) B S^{-1} B^\top \widehat{W}\big) \\
&\quad + \frac{2}{n}\,\mathrm{RSS}(\lambda, \omega)\, \mathbf{1}^\top \mathrm{dg}\big(G \widehat{W} \mathrm{diag}(\omega) B S^{-1} B^\top \widehat{W} - (I - H(\lambda, \omega)) G \widehat{W}\big)\Big\}\, d\lambda\, d\omega,
\end{aligned} \tag{B.17}
\]
where the differential $d_\omega V(\lambda, \omega)$ is given in Equation (B.12) and $\mathrm{dg}(Z) = \mathrm{diag}(z_{11}, \dots, z_{nn}) = I \odot Z$ for $Z = (z_{ij})$ a square matrix of order $n \times n$, with $\odot$ being the Hadamard product. Thus, evaluating (B.17) at $\lambda = \widehat\lambda$ and $\omega = \omega_0$ and applying the second identification theorem by Magnus and Neudecker (1999) we obtain $\partial^2 V(\lambda, \omega_0)/\partial\lambda\,\partial\omega^\top|_{\lambda=\widehat\lambda}$, and the proof is complete.


Proof of Proposition 5. Under the response perturbation scheme, $Y(\omega) = Y + \omega$, where $\omega = (\omega_1, \dots, \omega_n)^\top$ with $\omega_0 = \mathbf{0}$, the perturbed WGCV criterion assumes the form
\[
V(\lambda, \omega) = \frac{\mathrm{RSS}_{\widehat W}(\lambda, \omega)/n}{\{\mathrm{tr}(I - H_{\widehat W}(\lambda))/n\}^2},
\]
where
\[
\mathrm{RSS}_{\widehat W}(\lambda, \omega) = Y^\top(\omega)(I - H_{\widehat W}(\lambda))^\top \widehat{W} (I - H_{\widehat W}(\lambda)) Y(\omega),
\]
and
\[
H_{\widehat W}(\lambda) = B(B^\top \widehat{W} B + \lambda K^\top K)^{-1} B^\top \widehat{W}.
\]
In order to simplify the notation we will write $\mathrm{RSS}_{\widehat W}(\lambda, \omega) = \mathrm{RSS}(\lambda, \omega)$ and $H_{\widehat W}(\lambda) = H(\lambda)$. The first differential of $V(\lambda, \omega)$ with respect to $\omega$ assumes the form
\[
d_\omega V(\lambda, \omega) = \frac{1}{n}\, \frac{d_\omega \mathrm{RSS}(\lambda, \omega)}{\{1 - \mathrm{tr}(H(\lambda))/n\}^2},
\]
where
\[
d_\omega \mathrm{RSS}(\lambda, \omega) = 2\, Y^\top(\omega)(I - H(\lambda))^\top \widehat{W} (I - H(\lambda))\, d\omega.
\]
Let $\mathrm{edf}(\lambda) = \mathrm{tr}\, H(\lambda)$ be the effective number of parameters; taking the differential of $d_\omega V(\lambda, \omega)$ with respect to $\lambda$ yields
\[
d^2_{\lambda\omega} V(\lambda, \omega) = \frac{2}{n}\, d_\lambda (1 - \mathrm{edf}(\lambda)/n)^{-2}\, Y^\top(\omega)(I - H(\lambda))^\top \widehat{W} (I - H(\lambda))\, d\omega + \frac{2/n}{(1 - \mathrm{edf}(\lambda)/n)^2}\, Y^\top(\omega)\, d_\lambda\big[(I - H(\lambda))^\top \widehat{W} (I - H(\lambda))\big]\, d\omega. \tag{B.18}
\]
Let $S = B^\top \widehat{W} B + \lambda K^\top K$. Using that
\[
d_\lambda H(\lambda) = -B S^{-1} (d_\lambda S) S^{-1} B^\top \widehat{W} = -G \widehat{W}\, d\lambda,
\]
where $G = B S^{-1} K^\top K S^{-1} B^\top$, leads to
\[
d_\lambda\big[(I - H(\lambda))^\top \widehat{W} (I - H(\lambda))\big] = -(d_\lambda H(\lambda))^\top \widehat{W} (I - H(\lambda)) - (I - H(\lambda))^\top \widehat{W}\, d_\lambda H(\lambda) = \big(\widehat{W} G \widehat{W} (I - H(\lambda)) + (I - H(\lambda))^\top \widehat{W} G \widehat{W}\big)\, d\lambda, \tag{B.19}
\]
therefore
\[
d_\lambda (1 - \mathrm{edf}(\lambda)/n)^{-2} = \frac{2/n}{(1 - \mathrm{edf}(\lambda)/n)^3}\, \mathrm{tr}(d_\lambda H(\lambda)) = -\frac{2/n}{(1 - \mathrm{edf}(\lambda)/n)^3}\, \mathrm{tr}(G \widehat{W})\, d\lambda. \tag{B.20}
\]
Substituting equations (B.19) and (B.20) into (B.18) yields
\[
d^2_{\lambda\omega} V(\lambda, \omega) = \frac{2/n}{(1 - \mathrm{edf}(\lambda)/n)^2}\Big\{Y^\top(\omega)\big(\widehat{W} G \widehat{W} (I - H(\lambda)) + (I - H(\lambda))^\top \widehat{W} G \widehat{W}\big) - \frac{2/n}{1 - \mathrm{edf}(\lambda)/n}\, \mathrm{tr}(G \widehat{W})\, Y^\top(\omega)(I - H(\lambda))^\top \widehat{W} (I - H(\lambda))\Big\}\, d\lambda\, d\omega. \tag{B.21}
\]
Evaluating (B.21) at $\lambda = \widehat\lambda$ and $\omega = \omega_0$ and applying the second identification theorem by Magnus and Neudecker (1999), we obtain $\partial^2 V(\lambda, \omega_0)/\partial\lambda\,\partial\omega^\top|_{\lambda=\widehat\lambda}$, and Proposition 5 is verified.
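As a practical sanity check of the closed-form expressions derived above, the mixed derivative of Proposition 5 can be compared against a finite-difference approximation. The sketch below is illustrative only: the objects y, B, K and What (the diagonal matrix of weights) are assumed to be available, and the function names are hypothetical.

## Perturbed WGCV criterion under the response perturbation Y(omega) = Y + omega
V_pert <- function(lambda, omega, y, B, K, What) {
  n <- length(y)
  H <- B %*% solve(crossprod(B, What %*% B) + lambda * crossprod(K),
                   t(B) %*% What)                 # hat matrix H(lambda)
  e <- (diag(n) - H) %*% (y + omega)              # perturbed residuals
  (sum(e * (What %*% e)) / n) / (1 - sum(diag(H)) / n)^2
}
## Central finite differences for the (lambda, omega_i) mixed derivative,
## to be compared entrywise with the analytic expression in (B.21)
d2_num <- function(lambda, i, y, B, K, What, h = 1e-5) {
  ei <- replace(numeric(length(y)), i, 1)
  (V_pert(lambda + h,  h * ei, y, B, K, What) -
   V_pert(lambda + h, -h * ei, y, B, K, What) -
   V_pert(lambda - h,  h * ei, y, B, K, What) +
   V_pert(lambda - h, -h * ei, y, B, K, What)) / (4 * h^2)
}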


Appendix C. Real data example

We consider the balloon dataset reported by Davies and Gather (1993) and previously analyzed by Kovac and Silverman (2000), Lee and Oh (2007) and Tharmaratnam et al. (2010). The data are radiation measurements from the sun, taken during the flight of a weather balloon. Due to the rotation of the balloon, or for some other reasons, a large number of outliers were introduced because the measuring instrument was occasionally blocked from the sun. The dataset is included in the R package ftnonpar, available from the CRAN repository. The sample size equals 4984. Following the model settings of Tharmaratnam et al. (2010), we applied P-splines using B-splines of third degree and a second-order penalty, with 35 equally spaced knots. The model was fitted using the routine heavyPS from the R package heavy, also available at CRAN. All the computations were done on an IBM x3650 M4 server with 2 Intel Xeon E5-2670 processors, 128 GB of RAM and a Quadro 6000.

Table 1. Estimation summary for the balloon dataset under three fitted models.

Model       ν̂        λ̂        ℓ_λ(θ̂; Y_obs)   edf       WGCV     iterations   time (sec.)
Normal      —        0.3284   2470.705        27.1735   0.0220   3            0.060
Slash       0.4306   0.0081   7503.993        30.2584   0.0001   101          3.529
Student-t   0.9712   0.0230   7513.882        30.3645   0.0003   84           3.799
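For reference, the fits summarized in Table 1 can be reproduced along the following lines. This is a schematic sketch: the heavyPS arguments shown are assumptions made for illustration and should be checked against the documentation of the installed heavy version; only the existence of heavyPS and of the balloon data in ftnonpar is taken from the description above.

library(heavy)                    # provides heavyPS (Osorio, 2014)
library(ftnonpar)                 # provides the balloon radiation data
data(balloon)
x <- seq(0, 1, length.out = length(balloon))
## hypothetical call: cubic B-splines, second-order penalty, 35 knots,
## Student-t errors (argument names are assumptions; see ?heavyPS)
fit <- heavyPS(x, balloon, nseg = 35, deg = 3, pord = 2,
               family = Student(df = 4))   # df = 4 as a starting value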

As expected, the estimation procedure based on heavy-tailed distributions produces a curve estimate that is not affected by the outlying observations (see Figure 1 and Table 1). On the other hand, the curve estimate obtained under the normality assumption suffers from the presence of the outliers; the whole estimated curve is pulled downward, away from the majority of the observations. Another interesting feature of the proposed estimation procedure is that the implementation is fairly simple and ensures a reasonable computational speed (compare with the numerical experiments reported by Staudenmayer et al., 2009).

We conducted an additional experiment, considering a sequence of values for the number of knots. Figures 2 to 7 show the fitted curves for the balloon dataset considering 15, 20, 25, 30, 35 and 40 knots. Table 2 presents the estimation results for this dataset under normal, slash and Student-t errors. It is interesting to note that, for any number of knots, the fitted curve under Gaussian errors is strongly affected by the outlying observations, whereas the fitted curves obtained using heavy-tailed distributions do not suffer from this phenomenon. An unexpected finding is that the number of knots has an extremely large influence on the fitted curve under the normality assumption, but this influence is reduced when we consider distributions with heavier tails than the normal one.

Appendix D. Simulation study

A small simulation study was conducted to illustrate the practical performance of the proposed estimation procedure under a severe contamination scheme. The design of the simulation study is based on the one used by Tharmaratnam et al. (2010) (see also Cantoni and Ronchetti, 2001; Lee and Oh, 2007). We have used the following test function:
\[
Y_i = \sin(2\pi(1 - x_i)^2) + \epsilon_i, \qquad i = 1, \dots, n.
\]
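For concreteness, one clean replication of this design can be generated in R as follows; the sample size and the error standard deviation are illustrative assumptions, and the asymmetric contamination step of the scheme is omitted.

set.seed(1)                             # reproducibility
n <- 100                                # sample size: an assumption
x <- seq(0, 1, length.out = n)          # design points on [0, 1]
eps <- rnorm(n, sd = 0.3)               # clean errors; the sd is an assumption
y <- sin(2 * pi * (1 - x)^2) + eps      # test function above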

Figure 1. Fitted curve for the balloon dataset under three distributional assumptions: (−) normal, (− · −) Student-t and (− −) slash models.

Table 2. Summary for the balloon dataset under three fitted models, considering different number of knots.

Number      normal                    slash                              Student-t
of knots    λ̂       ℓ_λ(θ̂; Y_obs)    λ̂       ν̂       ℓ_λ(θ̂; Y_obs)    λ̂       ν̂       ℓ_λ(θ̂; Y_obs)
15          0.010   2408.03          0.000   0.499   6721.26           0.000   1.119   6742.48
20          0.011   2419.76          0.000   0.481   6945.77           0.001   1.076   6965.06
25          0.018   2439.09          0.000   0.432   7349.22           0.000   0.966   7375.85
30          0.657   2427.79          0.000   0.437   7382.80           0.001   0.976   7406.13
35          0.328   2470.71          0.008   0.431   7504.00           0.023   0.971   7513.88
40          0.089   2515.78          0.001   0.434   7554.08           0.001   0.967   7573.12

Figure 2. Fitted curve for the normal, Student-t and slash models using 15 knots.

Figure 3. Fitted curve for the normal, Student-t and slash models using 20 knots.

Figure 4. Fitted curve for the normal, Student-t and slash models using 25 knots.

x

0.6

0.8

1.0

x

Figure 4. Fitted curve for the normal, Student-t and slash models using 25 knots.

Figure 5. Fitted curve for the normal, Student-t and slash models using 30 knots.

We generated M = 1000 datasets of sample size n = 100 from the model above, with design points x_1, . . . , x_n drawn independently from the uniform distribution U(−1, 1) and random disturbances {ε_i} generated from the contaminated normal distribution (1 − δ)N(0, 0.7²) + δN(20, 2²), with δ = 0%, 5%, 10%, 25% and 40% of outliers. We applied P-splines using cubic (third-degree) B-splines with a second-order difference penalty and 25 knots placed at equally spaced quantiles of the data. The normal, slash and Student-t distributions were considered, fixing the degrees of freedom at 1/2, 1 and 2 for the slash and at 1, 2 and 4 for the Student-t, respectively.
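As an illustration only (not the code used for the reported experiment), the following R sketch generates one replicate of this design and computes a normal-errors P-spline fit by penalized least squares; the seed, the fixed contamination level and the knot construction are illustrative choices, and ordinary GCV is used as a simple stand-in for the WGCV criterion of Section 2.2.

    library(splines)

    set.seed(1)
    n <- 100
    x <- runif(n, -1, 1)
    g <- function(x) sin(2 * pi * (1 - x)^2)   # Cantoni-Ronchetti test function
    delta <- 0.10                              # contamination fraction (illustrative)
    out <- rbinom(n, 1, delta)                 # outlier indicators
    e <- ifelse(out == 1, rnorm(n, 20, 2), rnorm(n, 0, 0.7))
    y <- g(x) + e

    # cubic B-spline basis with 25 interior knots at equally spaced quantiles
    knots <- quantile(x, probs = seq(0, 1, length.out = 27)[-c(1, 27)])
    B <- bs(x, knots = knots, degree = 3, intercept = TRUE)
    k <- ncol(B)
    D <- diff(diag(k), differences = 2)        # second-order difference penalty

    # penalized least squares coefficients for a given smoothing parameter
    fit <- function(lambda) solve(crossprod(B) + lambda * crossprod(D), crossprod(B, y))

    # ordinary GCV, a simple stand-in for the WGCV of Section 2.2
    gcv <- function(lambda) {
      H <- B %*% solve(crossprod(B) + lambda * crossprod(D), t(B))
      n * sum((y - H %*% y)^2) / (n - sum(diag(H)))^2
    }
    lambda <- optimize(gcv, interval = c(1e-4, 1e2))$minimum
    ghat <- drop(B %*% fit(lambda))            # fitted curve at the design points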


Figure 6. Fitted curve for the normal, Student-t and slash models using 35 knots.

Figure 7. Fitted curve for the normal, Student-t and slash models using 40 knots.

The smoothing parameter was chosen by minimizing the weighted generalized cross-validation (WGCV) criterion defined in Section 2.2. Figure 8 displays two typical datasets together with the fitted curves from the normal and Student-t models (the fitted curve for the slash model is similar to the one obtained under Student-t errors and is not shown here). Unlike the P-spline fit under the normality assumption, the penalized spline estimator under Student-t errors with 1 degree of freedom remains close to the true regression function (solid line) in both situations, without and with outliers. To assess the performance of the proposed procedure we computed the mean squared error (MSE) for each simulated dataset j (j = 1, . . . , M) as

$$\mathrm{MSE}_j = \frac{1}{n}\sum_{i=1}^{n} \{g(x_i) - \hat{g}_j(x_i)\}^2, \qquad j = 1, \dots, M,$$

with M = 1000. We ran the simulation experiment on an IBM x3650 M4 server with two Intel Xeon E5-2670 processors, 128 GB of RAM and a Quadro 6000 graphics card; the total computation time was 16 hours, 56 minutes and 5.13 seconds. Figures 9 to 11 present boxplots of the MSE on the logarithmic scale for several contamination percentages, considering P-spline estimation under the normal, Student-t and slash distributions. From these plots it is clear that, in the presence of outliers, the proposed procedure produces better estimates; indeed, our findings are very similar to those reported by Tharmaratnam et al. (2010).
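Given the fitted values from the sketch above, the per-replicate MSE of the displayed formula is a one-liner in R; repeating it over j = 1, . . . , M and taking logarithms yields the quantities summarized in the boxplots (again a sketch, not the experiment's code).

    # MSE of one replicate, matching the displayed formula
    # (x, ghat and g come from the earlier sketch)
    mse <- mean((g(x) - ghat)^2)
    log(mse)   # Figures 9 to 11 report the MSE on the log scale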

Figure 8. Fitted curve for the Cantoni and Ronchetti (2001) test function (a) without outliers and (b) with 25% of outliers from N(20, 2²). True function sin(2π(1 − x)²) (solid line), fitted curve using normal errors (penalized LS estimation, dotted) and assuming the Student-t model with ν = 1 (dashed).

Figure 9. Boxplot of MSE using normal, Student-t (ν = 1) and slash (ν = 1/2) models considering several contamination levels for the Cantoni and Ronchetti (2001) test function.

Figure 10. Boxplot of MSE using normal, Student-t (ν = 2) and slash (ν = 1) models considering several contamination levels for the Cantoni and Ronchetti (2001) test function.


Figure 11. Boxplot of MSE using normal, Student-t (ν = 4) and slash (ν = 2) models considering several contamination levels for the Cantoni and Ronchetti (2001) test function.

However, it should be noted that the penalized spline estimator under heavy-tailed distributions remains stable, as the proportion of contamination increases, only for small (and fixed) degrees of freedom. In our simulation experiment the MSE of the penalized spline estimator grows rapidly beyond 10% of outliers for degrees of freedom as small as 4 or 2 for the Student-t and slash distributions, respectively. In fact, as discussed by Lange et al. (1989), modeling with distributions with heavier tails than the normal is not a panacea for all robustness problems. In particular, we can expect difficulties with P-splines under scale mixtures of normal distributions for data with extreme outliers.
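One way to see this limitation is through the EM-type case weights implied by the Student-t model, u = (ν + 1)/(ν + r²) for a scaled residual r (Lange et al. 1989): small ν downweights extreme residuals sharply, while larger ν pulls all weights toward one. The R lines below, with purely illustrative residual values, sketch this effect.

    # EM-type Student-t case weights: small nu downweights outliers
    # sharply, while larger nu leaves them with non-negligible weight
    t_weight <- function(r, nu) (nu + 1) / (nu + r^2)
    r <- c(0.5, 2, 10, 30)                     # scaled residuals; last two extreme
    rbind(nu_1 = t_weight(r, 1), nu_4 = t_weight(r, 4))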

References

Brent, R.P. (1973). Algorithms for Minimization Without Derivatives. Dover, New York.
Cantoni, E., Ronchetti, E. (2001). Resistant selection of the smoothing parameter for smoothing splines. Statistics and Computing 11, 141-146.
Davies, L., Gather, U. (1993). The identification of multiple outliers (with discussion). Journal of the American Statistical Association 88, 782-801.
Dongarra, J.J., Bunch, J.R., Moler, C.B., Stewart, G.W. (1979). Linpack Users Guide. SIAM Publications, Philadelphia.
Kovac, A., Silverman, B.W. (2000). Extending the scope of wavelet regression methods by coefficient-dependent thresholding. Journal of the American Statistical Association 95, 172-183.
Lange, K., Little, R.J.A., Taylor, J.M.G. (1989). Robust statistical modeling using the t distribution. Journal of the American Statistical Association 84, 881-896.
Lawson, C.L., Hanson, R.J., Kincaid, D., Krogh, F.T. (1979). Basic linear algebra subprograms for FORTRAN usage. ACM Transactions on Mathematical Software 5, 308-323.
Lee, T.C.M., Oh, H.S. (2007). Robust penalized regression spline fitting with application to additive mixed modeling. Computational Statistics 22, 159-171.
Magnus, J.R., Neudecker, H. (1999). Matrix Differential Calculus with Applications in Statistics and Econometrics. Wiley, Chichester.
R Core Team (2014). R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria. http://www.R-project.org
Ruppert, D. (2002). Selecting the number of knots for penalized splines. Journal of Computational and Graphical Statistics 11, 735-757.
Staudenmayer, J., Lake, E.E., Wand, M.P. (2009). Robustness for general design mixed models using the t-distribution. Statistical Modelling 9, 235-255.
Tharmaratnam, K., Claeskens, G., Croux, C., Salibián-Barrera, M. (2010). S-estimation for penalized regression splines. Journal of Computational and Graphical Statistics 19, 609-625.
