Soft-Cascade Learning with Explicit Computation Time Considerations

IEEE Winter Conf. on Applications of Computer Vision (WACV)

Francisco Rodolfo Barbosa-Anda, Frédéric Lerasle, Cyril Briand and Alhayat Ali Mekonnen LAAS-CNRS, Université de Toulouse, CNRS, UPS, Toulouse, France

March 13, 2018

Table of contents

1. Introduction and Motivations
2. A novel iterative search procedure for large scale problems
3. Evaluations
4. Conclusion and Contributions

Introduction and Motivations

Soft-cascade

Figure: Soft-Cascade Detector. A sample is scored by a sequence of weak classifiers h1, . . ., hL with weights α1, . . ., αL. At stage l the accumulated score Sl = Sl−1 + αl hl is tested against a stage threshold θl: if Sl ≥ θl the sample proceeds to the next stage ("Yes"), otherwise ("No") it is rejected immediately.
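The early-rejection mechanism in the figure can be sketched as follows (a minimal illustration; `weak_classifiers`, `weights`, and `thresholds` are hypothetical names, not the paper's implementation):

```python
def soft_cascade_predict(x, weak_classifiers, weights, thresholds):
    """Evaluate a soft-cascade: accumulate weighted weak-classifier
    scores and reject as soon as the running score drops below the
    current stage threshold."""
    s = 0.0
    for h, alpha, theta in zip(weak_classifiers, weights, thresholds):
        s += alpha * h(x)      # S_l = S_{l-1} + alpha_l * h_l(x)
        if s < theta:          # fails the test S_l >= theta_l
            return False, s    # early reject: remaining stages skipped
    return True, s             # survived all L stages: classified positive
```

The computation-time saving comes precisely from this early exit: most negative samples never reach the later, more expensive stages.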

Existing threshold tuning strategies: the BIP-based approach of Barbosa-Anda et al. [1], Direct Backward Pruning (DBP) of Zhang and Viola [6], WaldBoost of Sochman and Matas [4], the "soft-cascade" of Bourdev and Brandt [2], the boosting chain of Xiao et al. [5], and the fixed vector of Dollár et al. [3].

Search Space

Figure: (a) A score tree. (b) Its threshold graph. (Both plot score/threshold values from −2 to 2 against the weak classifier index 1–4.)
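As a rough illustration of where the threshold graph's nodes come from (an assumption made for this sketch: the candidate thresholds at each stage are the distinct cumulative scores observed on training samples, which the paper organizes via the score tree):

```python
def threshold_graph_nodes(scores):
    """scores[n][l] holds the cumulative score S_{n,l} of training
    sample n after weak classifier l.  For each stage l, return the
    sorted distinct scores: the candidate threshold values that form
    the nodes of the threshold graph at that stage."""
    n_stages = len(scores[0])
    return [sorted({row[l] for row in scores}) for l in range(n_stages)]
```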

A novel iterative search procedure for large scale problems


Graph local search

Algorithm 1 Graph local search (GLS)

Figure: A neighborhood in the threshold graph. (Thresholds from −2 to 2 against the weak classifier index 1–4.)

Require: tpr(Θ0) ≥ TPR and δmax ≥ 1
  α ← 1
  better1 ← true
  while better1 do
    δ ← 1
    better2 ← false
    Θα,0 ← Θα−1
    while not better2 and δ ≤ δmax do
      Θα,δ ← BIP(T(Θα,δ−1), TPR)
      better2 ← objfunc(Θα,δ) < objfunc(Θα,δ−1)
      δ ← δ + 1
    end while
    Θα ← Θα,δ−1
    better1 ← objfunc(Θα) < objfunc(Θα−1)
    α ← α + 1
  end while
  return Θα−1
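The two nested improvement loops of Algorithm 1 can be sketched in Python as below. This is a hedged sketch, not the paper's implementation: `bip_solve` stands in for the BIP restricted to a δ-neighborhood of the current thresholds, and `objfunc` for the cascade cost, both supplied by the caller; the bookkeeping is restructured slightly but the control flow is the same.

```python
def graph_local_search(theta0, TPR, delta_max, bip_solve, objfunc):
    """Iteratively solve the BIP on growing delta-neighborhoods of the
    incumbent thresholds, accepting a candidate only if it lowers the
    objective; stop once no neighborhood size up to delta_max improves."""
    theta_prev = theta0
    improved_outer = True
    while improved_outer:
        delta, improved_inner = 1, False
        theta_cur = theta_prev
        # widen the neighborhood until the BIP finds a better solution
        while not improved_inner and delta <= delta_max:
            candidate = bip_solve(theta_cur, delta, TPR)
            improved_inner = objfunc(candidate) < objfunc(theta_cur)
            if improved_inner:
                theta_cur = candidate
            delta += 1
        improved_outer = objfunc(theta_cur) < objfunc(theta_prev)
        if improved_outer:
            theta_prev = theta_cur
    return theta_prev
```

Restricting the BIP to a small neighborhood keeps each subproblem tractable, which is what makes the procedure usable at large scale.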


A cascade reduction procedure

Algorithm 2 Iterative search procedure

Figure: A reduction in the threshold graph. (Thresholds from −2 to 2 against the weak classifier index 1–4.)

Require: tpr(Θ0) ≥ TPR and δmax ≥ 1
  better ← true
  β ← 1
  L0 ← L
  while better and lTPR < L0 do
    L0 ← lTPR
    Θ^L0_{β−1} ← {θ1, . . ., θ_L0} ∈ Θ_{β−1}
    Θ^L0_β ← GLS(Θ^L0_{β−1}, TPR, δmax)
    for all l, 1 ≤ l ≤ L do
      if l ≤ L0 then
        θ_l ∈ Θ_β ← θ^L0_l ∈ Θ^L0_β
      else
        θ_l ∈ Θ_β ← min{ S_{n,l} | S_{n,L0} > θ^L0_{L0} ∈ Θ^L0_β, y_n = 1 }
      end if
    end for
    better ← objfunc(Θ_β) < objfunc(Θ_{β−1})
    β ← β + 1
  end while
  return Θ_{β−1}
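The reduction loop of Algorithm 2 can be outlined as follows. This is a simplified sketch under stated assumptions: `gls` plays the role of Algorithm 1, `last_needed_stage` returns lTPR (the last stage actually needed to reach the target TPR), and the dropped tail stages are back-filled with a pass-through threshold of −∞ rather than the paper's minimum positive-sample score.

```python
def cascade_reduction(theta0, TPR, delta_max, gls, last_needed_stage, objfunc):
    """Repeatedly truncate the cascade at stage l_TPR, re-optimize the
    truncated thresholds with GLS, and keep the shorter cascade as long
    as it improves the objective."""
    theta_prev = list(theta0)
    L = len(theta0)
    L0, better = L, True
    while better and last_needed_stage(theta_prev) < L0:
        L0 = last_needed_stage(theta_prev)
        head = gls(theta_prev[:L0], TPR, delta_max)    # re-optimize kept stages
        theta_new = head + [float("-inf")] * (L - L0)  # tail stages never reject
        better = objfunc(theta_new) < objfunc(theta_prev)
        if better:
            theta_prev = theta_new
    return theta_prev
```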

Suggest Documents