Numerical Contributions to the Asymptotic Theory of Robustness

Matthias Kohl
Fakultät für Mathematik und Physik, Universität Bayreuth

Promotionskolloquium 15.12.2005


Outline

1. Asymptotic Theory of Robustness – an Abridgment
   - Asymptotically Linear Estimators
   - Infinitesimal Robust Setup
   - Optimally Robust Influence Curves
2. Supplements to the Asymptotic Theory of Robustness
   - Mean Square Error Solution
   - Radius-Minimax Estimator
   - One-Step Construction
   - Convergence of Robust Models


Ideal Model

Parametric family of probability measures

P = {P_θ | θ ∈ Θ},  Θ ⊂ R^k open,

smoothly parameterized, i.e., L2 differentiable at θ ∈ Θ with L2 derivative Λ_θ ∈ L_2^k(P_θ), E_θ Λ_θ = 0, and Fisher information of full rank,

I_θ = E_θ Λ_θ Λ_θ^τ ≻ 0


Ideal Model in S4 Classes

ProbFamily
  name : character
  distribution : Distribution
  distrSymm : DistributionSymmetry
  props : character

ParamFamily (extends ProbFamily)
  param : ParamFamParameter

L2ParamFamily (extends ParamFamily)
  L2deriv : EuclRandVarList
  L2derivSymm : FunSymmList
  L2derivDistr : DistrList
  L2derivDistrSymm : DistrSymmList
  FisherInfo : PosDefSymmMatrix

- meta-information slot props
- semi-symbolic calculus for symmetry properties
- generating functions for various L2-families


Influence Curves (ICs) and AL Estimators

Definition. The set Ψ_2(θ) of all square integrable ICs at P_θ is

Ψ_2(θ) = { ψ_θ ∈ L_2^k(P_θ) | E_θ ψ_θ = 0, E_θ ψ_θ Λ_θ^τ = I_k }

Definition. An asymptotic estimator S_n : (Ω^n, A^n) → (R^k, B^k) is called asymptotically linear at P_θ if there is an IC ψ_θ ∈ Ψ_2(θ) with

√n (S_n − θ) = (1/√n) Σ_{i=1}^n ψ_θ(y_i) + o_{P_θ^n}(n^0)


Influence Curves in S4 Classes

InfluenceCurve
  name : character
  Curve : EuclRandVarList
  Risks : list
  Infos : matrix

IC (extends InfluenceCurve)
  CallL2Fam : call

ContIC (extends IC)
  neighborRadius : numeric
  clip : numeric
  cent : numeric
  stand : matrix
  lowerCase : OptionalNumeric

TotalVarIC (extends IC)
  neighborRadius : numeric
  clipLo : numeric
  clipUp : numeric
  stand : matrix


Neighborhoods

Convex contamination neighborhood of radius r ∈ [0, ∞):

U_c(θ, r) = { (1 − r)_+ P_θ + (1 ∧ r) Q | Q ∈ M_1(A) }

Simple perturbations, for √n ≥ −r inf_{P_θ} q, are defined as

dQ_n(q, r) = (1 + (r/√n) q) dP_θ

where

q ∈ G_c(θ) = { q ∈ L_∞(P_θ) | E_θ q = 0, inf_{P_θ} q ≥ −1 }
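As an illustration (not part of the talk), one can draw from a member of the contamination neighborhood by flipping an r-coin between the ideal distribution and an arbitrary contaminating Q; the choices P = N(0, 1) and Q = Dirac at 10 below are assumptions for the sketch:

```python
import numpy as np

def rcontam(n, r, rng, ideal=None, contam=None):
    """Draw n observations from (1 - r)*P + r*Q, one member of U_c(theta, r)."""
    ideal = ideal or (lambda size: rng.normal(0.0, 1.0, size))   # P = N(0, 1)
    contam = contam or (lambda size: np.full(size, 10.0))        # Q = Dirac at 10
    flip = rng.uniform(size=n) < r     # True -> draw from the contamination Q
    y = ideal(n)
    y[flip] = contam(int(flip.sum()))
    return y, flip

rng = np.random.default_rng(0)
y, flip = rcontam(100_000, r=0.05, rng=rng)
print(flip.mean())   # fraction of gross errors, close to r = 0.05
```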


Neighborhoods in S4 Classes

Neighborhood
  type : character
  radius : numeric

UncondNeighborhood (extends Neighborhood)
  ContNeighborhood (extends UncondNeighborhood)
  TotalVarNeighborhood (extends UncondNeighborhood)

The neighborhood may be of fixed size or of shrinking radius, i.e., r ↦ r/√n.


Robust Models in S4 Classes

RobModel
  center : ProbFamily
  neighbor : Neighborhood

FixRobModel (extends RobModel)
  center : ParamFamily
  neighbor : UncondNeighborhood

InfRobModel (extends RobModel)
  center : L2ParamFamily
  neighbor : UncondNeighborhood


Asymptotic Mean Square Error Problem

Choosing quadratic loss, one obtains for fixed r ∈ (0, ∞)

maxMSE_θ(η_θ, r) = E_θ|η_θ|^2 + r^2 ω_{c,θ}(η_θ)^2 = min!,  η_θ ∈ Ψ_2^D(θ)

with ω_{c,θ}(η_θ) = sup_{P_θ} |η_θ|, by Proposition 5.3.3(a) of Rieder (1994).


Risks in S4 Classes

RiskType
  type : character

asRisk (extends RiskType): asymptotic risks
  asCov, trAsCov, asBias
  asHampel (bound : numeric)
  asGRisk: asMSE, asUnOvShoot (width : numeric)

fiRisk (extends RiskType): finite-sample risks
  fiCov, trFiCov, fiBias, fiMSE
  fiHampel (bound : numeric)
  fiUnOvShoot (width : numeric)


Unique MSE Solution

Theorem 5.5.7(b), Rieder (1994):

η̃_θ = (A_θ Λ_θ − a_θ) w,  w = min(1, b_θ / |A_θ Λ_θ − a_θ|)

with Lagrange multipliers A_θ, a_θ, and b_θ determined by

a_θ = A_θ z_θ,  0 = E_θ (Λ_θ − z_θ) w
I_k = A_θ E_θ (Λ_θ − z_θ)(Λ_θ − z_θ)^τ w
r^2 b_θ = E_θ (|A_θ Λ_θ − a_θ| − b_θ)_+
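In the one-dimensional normal location case (Λ_θ(y) = y − μ; z_θ = 0 and a_θ = 0 by symmetry) the system collapses to a single fixed-point equation r^2 c = E(|Λ_θ| − c)_+ for the clipping constant c = b_θ/A_θ, and the standardization becomes A_θ = 1/(2Φ(c) − 1). A minimal numerical sketch (Python rather than the talk's R; the closed-form normal moments used are standard):

```python
from scipy.stats import norm
from scipy.optimize import brentq

def mse_solution_location(r):
    """Lagrange multipliers of the optimally robust IC for N(mu, 1) location:
    eta(y) = A * psi_c(y), psi_c the Huber function with clip c = b/A."""
    # fixed point: r^2 * c = E(|Lambda| - c)_+ = 2*(phi(c) - c*(1 - Phi(c)))
    g = lambda c: r**2 * c - 2.0 * (norm.pdf(c) - c * norm.sf(c))
    c = brentq(g, 1e-8, 20.0)
    A = 1.0 / (2.0 * norm.cdf(c) - 1.0)   # standardization: E[eta * Lambda] = 1
    return A, A * c, c                    # A_theta, b_theta, clip in y-scale

def max_mse(A, b, c, r):
    """maxMSE = E|eta|^2 + r^2 * b^2 for eta = A * psi_c."""
    E_psi2 = 2*norm.cdf(c) - 1 - 2*c*norm.pdf(c) + 2*c**2*norm.sf(c)
    return A**2 * E_psi2 + r**2 * b**2

A, b, c = mse_solution_location(0.5)
print(A, max_mse(A, b, c, 0.5))   # the two numbers agree: maxMSE = tr A_theta
```

At the fixed point the maximal MSE equals A_θ, i.e., tr A_θ, in line with the bound maxMSE(η̃_θ, r) = tr A_θ.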


Properties of MSE Solution I

Classical Cramér–Rao bound: Cov(η_θ) ⪰ Cov(ψ̂_θ) = I_θ^{-1}, where ψ̂_θ = I_θ^{-1} Λ_θ. Hence

MSE(η_θ) = tr Cov(η_θ) ≥ tr Cov(ψ̂_θ) = tr I_θ^{-1} = MSE(ψ̂_θ)

Generalized by Proposition 2.1.1, Kohl (2005):

maxMSE(η_θ, r) ≥ maxMSE(η̃_θ, r) = tr A_θ


Properties of MSE Solution II

- The Lagrange multipliers are bounded (Kohl (2005)).
- The Lagrange multipliers are not necessarily unique (Rieder (1994), Kohl (2005)).
- The Lagrange multipliers are continuous w.r.t. the radius r ∈ (0, ∞) (Kohl (2005)).
- The Lagrange multipliers are continuous w.r.t. the parameter θ ∈ Θ (Kohl (2005)).


MSE–Inefficiency

Definition. The MSE-inefficiency of η̃_{r_0} w.r.t. η̃_r is defined as

relMSE(η̃_{r_0}, r) = maxMSE(η̃_{r_0}, r) / maxMSE(η̃_r, r)

where

maxMSE(η̃_{r_0}, r) = E|η̃_{r_0}|^2 + r^2 ω_*(η̃_{r_0})^2,  * = c, v

Remark. The MSE-inefficiency was first considered and numerically evaluated in Rieder et al. (2001).
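For normal location, relMSE can be evaluated in closed form: the numerator combines the variance of η̃_{r_0} with its bias bound at the foreign radius r. A sketch (Python, using standard normal moments; an illustration, not code from the talk):

```python
from scipy.stats import norm
from scipy.optimize import brentq

def multipliers(r):
    """Clipping constant c and standardization A of the optimal location IC."""
    g = lambda c: r**2 * c - 2.0 * (norm.pdf(c) - c * norm.sf(c))
    c = brentq(g, 1e-8, 20.0)
    return c, 1.0 / (2.0 * norm.cdf(c) - 1.0)

def max_mse(r0, r):
    """maxMSE(eta_{r0}, r) = E|eta_{r0}|^2 + r^2 * omega_c(eta_{r0})^2."""
    c, A = multipliers(r0)
    var = A**2 * (2*norm.cdf(c) - 1 - 2*c*norm.pdf(c) + 2*c**2*norm.sf(c))
    return var + r**2 * (A * c)**2        # bias term: omega_c = sup|eta| = A*c

def rel_mse(r0, r):
    return max_mse(r0, r) / max_mse(r, r)

print(rel_mse(0.25, 1.0), rel_mse(0.25, 0.25))   # above 1, and exactly 1 at r = r0
```

Since the MSE solution is the unique minimizer, relMSE exceeds 1 whenever the estimator is tuned for the wrong radius.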


Maximum MSE–Inefficiency

[Figure: MSE-inefficiency in case of normal location and scale, plotted against the neighborhood radius (0.05 to 5.0, log scale) for the optimally robust ICs with r = 0.140, 0.231, 0.396, 0.579, and r = ∞; marked inefficiency levels include 1.000, 1.050, 1.100, 1.200, 1.314, 1.599, 1.756, 2.219, and 3.089.]


One-Step Estimator

The one-step estimator S = (S_n) is defined as

S_n = θ̂_n + (1/n) Σ_{i=1}^n ψ_{θ̂_n(y_1,…,y_n)}(y_i)

where θ̂_n is an appropriate initial estimate.

- works in case of exponential families of full rank (cf. Lemma 2.3.6, Kohl (2005))
- initial estimator: Kolmogorov(–Smirnov) minimum distance estimator (cf. Theorem 6.3.7, Rieder (1994))
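For normal location with known scale the construction is a few lines: take a robust initial estimate (here the median, for simplicity, rather than the MD estimator) and add the average of the optimally robust IC evaluated at the residuals. A sketch (Python; the clipping constant c = 1.34 is an assumed value, roughly corresponding to r ≈ 0.25, not a number from the talk):

```python
import numpy as np
from scipy.stats import norm

def one_step_location(y, c=1.34):
    """One-step estimator for normal location (scale known and = 1):
    S_n = median(y) + mean of the optimally robust IC at the initial fit."""
    theta0 = np.median(y)                   # robust initial estimate
    A = 1.0 / (2.0 * norm.cdf(c) - 1.0)    # standardization: E[eta * Lambda] = 1
    eta = A * np.clip(y - theta0, -c, c)   # eta = A * Lambda * min(1, c/|Lambda|)
    return theta0 + eta.mean()

rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(0.0, 1.0, 950), np.full(50, 8.0)])  # 5% gross errors
print(one_step_location(y))   # stays near the true location 0 despite the outliers
```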


Optimally Robust Estimation – a Proposal

0. Choose an appropriate parametric family.
1. Choose and evaluate an appropriate initial estimate; e.g., the Kolmogorov(–Smirnov) MD estimator.
2. Depending on the quality of the data, try to find a rough estimate for the amount ε ∈ [0, 1] of gross errors such that ε ∈ [ε̲, ε̄].
3. Estimate the parameter of interest by means of the corresponding radius-minimax estimator using the one-step construction.

Via our R package ROptEst this proposal so far works for all(!) L2-differentiable parametric families which are based on a univariate distribution.


Normal Location and Scale

Ideal model:

P = { P_θ = N(μ, σ^2) | θ = (μ, σ)^τ ∈ R × (0, ∞) }

L2 derivative and Fisher information at θ = (μ, σ)^τ:

Λ_θ(y) = (1/σ) ( (y − μ)/σ , (y − μ)^2/σ^2 − 1 )^τ,  I_θ = (1/σ^2) diag(1, 2)

Invariant under the group of transformations g_θ(u) = σu + μ, i.e., P_θ = g_θ(P_{θ_0}) where θ_0 = (0, 1)^τ.
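The stated L2 derivative and Fisher information can be checked numerically by integrating against the N(μ, σ^2) density; μ = 1.5 and σ = 2 below are arbitrary test values, not from the talk. A sketch (Python):

```python
import numpy as np
from scipy import integrate
from scipy.stats import norm

mu, sigma = 1.5, 2.0   # arbitrary test values

def Lambda(y):
    """L2 derivative of N(mu, sigma^2) with respect to theta = (mu, sigma)."""
    z = (y - mu) / sigma
    return np.array([z / sigma, (z**2 - 1.0) / sigma])

dens = lambda y: norm.pdf(y, mu, sigma)
E = lambda f: integrate.quad(lambda y: f(y) * dens(y), -np.inf, np.inf)[0]

score_mean = [E(lambda y, i=i: Lambda(y)[i]) for i in range(2)]
fisher = [[E(lambda y, i=i, j=j: Lambda(y)[i] * Lambda(y)[j]) for j in range(2)]
          for i in range(2)]
print(score_mean)   # ~ [0, 0]: E_theta Lambda_theta = 0
print(fisher)       # ~ [[1/sigma^2, 0], [0, 2/sigma^2]]
```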


Poisson and Normal Approximation of the Binomial Distribution

Poisson approximation: let λ = mp (p small, m large). Then

Binom(m, p) ≈ Pois(λ)

Normal approximation (mp(1 − p) ≥ 9):

Binom(m, p) ≈ N(mp, mp(1 − p))
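Both approximations are easy to inspect numerically; the sample sizes and success probabilities below are arbitrary choices satisfying the respective conditions. A sketch (Python, using scipy.stats):

```python
import numpy as np
from scipy.stats import binom, poisson, norm

# Poisson regime: p small, m large, lambda = m*p
m, p = 1000, 0.002
k = np.arange(0, 21)   # pmf outside this range is negligible for lambda = 2
tv_pois = 0.5 * np.abs(binom.pmf(k, m, p) - poisson.pmf(k, m * p)).sum()
print(tv_pois)         # total variation distance, tiny

# normal regime: m*p*(1-p) = 24 >= 9, with continuity correction
m, p = 100, 0.4
k = np.arange(0, m + 1)
err = np.abs(binom.cdf(k, m, p)
             - norm.cdf(k + 0.5, m * p, np.sqrt(m * p * (1 - p))))
print(err.max())       # maximal cdf error, small
```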


Setup

Let

P_ν = {P_{ν,θ} | θ ∈ Θ} ⊂ M_1(A_ν),  ν ∈ N_0,

be a sequence of L2-differentiable parametric families. In addition, let r_n := r/√n and consider

U_{c,ν}(θ, r_n) = { (1 − r_n)_+ P_{ν,θ} + (1 ∧ r_n) Q_ν | Q_ν ∈ M_1(A_ν) }

Question: P_ν ≈ P_0, or even U_{c,ν}(θ, r_n) ≈ U_{c,0}(θ, r_n)?


Convergence of Experiments

Definition 2.2.1, Le Cam and Lo Yang (2000)
The deficiency δ(Pν, P0) of Pν w.r.t. P0 is the smallest number δ ∈ [0, 1] such that for every loss function W with 0 ≤ W ≤ 1 and every risk function r2 there is a risk function r1 with

r1(Pν,θ, W) ≤ r2(P0,θ, W) + δ   for all θ ∈ Θ.

Theorem 2.4.1, Kohl (2005)
Assume that the laws of the corresponding L2 derivatives as well as the traces of the corresponding Fisher informations converge under suitable standardizations. Then the suitably standardized maximum asymptotic MSE of the corresponding optimally robust estimators converges.
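For two concrete experiments the deficiency is bounded by the total variation distance between the corresponding laws. A minimal sketch (not from the slides, and only an upper-bound proxy for the deficiency): the TV distance between Binom(m, p) and Pois(mp), with mp = 2 held fixed, shrinks as p → 0, illustrating the Poisson model as a limit.

```python
# Sketch: total variation distance between Binom(m, p) and Pois(mp).
import math

def tv_binom_pois(m, p, kmax=100):
    """0.5 * sum_k |Binom(m,p)(k) - Pois(mp)(k)|, truncated at kmax;
    the tail beyond kmax is negligible for small mp."""
    lam = m * p
    def b(k):
        return math.comb(m, k) * p**k * (1 - p)**(m - k) if k <= m else 0.0
    def q(k):
        return math.exp(-lam) * lam**k / math.factorial(k)
    return 0.5 * sum(abs(b(k) - q(k)) for k in range(kmax + 1))

for m in (10, 100, 1000):
    print(m, tv_binom_pois(m, 2 / m))  # decreases toward 0 as p -> 0
```

By Le Cam's classical bound the distance is at most m·p² = λ·p here, which already forces convergence along p = λ/m.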


Approximation of ICs for r = 0.25

[Figure: four panels plotting the influence curves (IC) of the Binom, Norm, and Pois models against x ∈ {0, …, 25}, with per-panel annotations:
Binom(25, 0.05): relMSE(P, B) = 1, relMSE(N, B) = 1.001, m*p(1−p) = 1.188
Binom(25, 0.25): relMSE(P, B) = 1.012, relMSE(N, B) = 1.002, m*p(1−p) = 4.688
Binom(25, 0.5): relMSE(P, B) = 1.046, relMSE(N, B) = 1, m*p(1−p) = 6.25
Binom(25, 0.99): relMSE(P, B) = 3.301, relMSE(N, B) = 1, m*p(1−p) = 0.248]
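The m*p(1−p) values annotating the four panels are easy to reproduce, and none of the settings meets the mp(1 − p) ≥ 9 rule of thumb quoted earlier for the normal approximation. A minimal sketch:

```python
# Sketch: recompute m*p(1-p) for the four panel settings and check the
# mp(1-p) >= 9 rule of thumb for the normal approximation.
m = 25
for p in (0.05, 0.25, 0.5, 0.99):
    v = m * p * (1 - p)
    print(f"Binom({m}, {p}): m*p(1-p) = {v:.3f}, rule satisfied: {v >= 9}")
```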


Bibliography

M. Kohl. Numerical Contributions to the Asymptotic Theory of Robustness. Dissertation, University of Bayreuth, 2005.
L. Le Cam and G. Lo Yang. Asymptotics in Statistics. Springer, 2000.
H. Rieder. Robust Asymptotic Statistics. Springer, 1994.
H. Rieder, M. Kohl and P. Ruckdeschel. The Costs of Not Knowing the Radius. Submitted. Appeared as Discussion Paper No. 81, SFB 373, Humboldt University, Berlin, 2001.
