Compact relaxations for polynomial programming problems

S. Cafieri (ENAC Toulouse, F), P. Hansen (GERAD & HEC, CA), L. Létocart (LIPN Paris 13, F), L. Liberti (LIX, École Polytechnique, F), F. Messine (ENSEEIHT Toulouse, F)

June 1, 2012
Compact relaxations
SEA 2012 – 1 / 40
Introduction
The setting

- Focus on (nonconvex) polynomial programming problems
- Aim to solve them with spatial Branch-and-Bound (sBB)
- Need a tight convex (linear) relaxation at each node
- Reformulate before relaxing
Exact reformulations

[Figure: problem P and its reformulation Q; maps φ, φ|G, φ|L relate the feasible set F and the global/local optima G, L of Q to those of P]

- P is harder than Q
- find optima in Q, map them back to P
- for each optimum of P there is a corresponding optimum of Q
Reformulation-Linearization Technique
Aim

Reformulation-Linearization Technique (RLT): tightens the linear relaxation of mixed-integer (nonconvex) QCQPs

    min_{x ∈ ℝ^n1 × ℤ^n2}  c0 x + x Q0 x
    ∀ 1 ≤ i ≤ q   ci x + x Qi x ≤ 0
    ∀ 1 ≤ i ≤ m   ai x ≤ bi
                  x ∈ [x^L, x^U].
RLT constraints 1/2

∀ j, ℓ ≤ n = n1 + n2 and i ≤ m, all factors are non-negative ⇒ the constraints are valid:

    (xj − xj^L)(xℓ − xℓ^L) ≥ 0
    (xj − xj^L)(xℓ^U − xℓ) ≥ 0
    (xj^U − xj)(xℓ − xℓ^L) ≥ 0
    (xj^U − xj)(xℓ^U − xℓ) ≥ 0

    (xj − xj^L)(bi − ai x) ≥ 0
    (xj^U − xj)(bi − ai x) ≥ 0
RLT constraints 2/2

∀ j, ℓ ≤ n and i ≤ m, expanding the products gives:

    xj xℓ − xℓ^L xj − xj^L xℓ + xj^L xℓ^L ≥ 0
    −xj xℓ + xℓ^U xj + xj^L xℓ − xj^L xℓ^U ≥ 0
    −xj xℓ + xℓ^L xj + xj^U xℓ − xj^U xℓ^L ≥ 0
    xj xℓ − xℓ^U xj − xj^U xℓ + xj^U xℓ^U ≥ 0

    −xj ai x + xj^L ai x + xj bi − xj^L bi ≥ 0
    xj ai x − xj^U ai x − xj bi + xj^U bi ≥ 0
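The four expanded bound-factor inequalities can be checked numerically: each is a product of two non-negative factors, so it is non-negative everywhere on the box. A minimal sketch (the bounds below are illustrative):

```python
import random

# Box bounds for two variables xj, xl (illustrative values)
xjL, xjU = -1.0, 2.0
xlL, xlU = 0.5, 3.0

random.seed(0)
for _ in range(1000):
    xj = random.uniform(xjL, xjU)
    xl = random.uniform(xlL, xlU)
    # Expanded bound-factor products; each equals a product of two
    # non-negative factors, hence is non-negative on the box.
    g1 = xj*xl - xlL*xj - xjL*xl + xjL*xlL   # (xj-xjL)(xl-xlL) >= 0
    g2 = -xj*xl + xlU*xj + xjL*xl - xjL*xlU  # (xj-xjL)(xlU-xl) >= 0
    g3 = -xj*xl + xlL*xj + xjU*xl - xjU*xlL  # (xjU-xj)(xl-xlL) >= 0
    g4 = xj*xl - xlU*xj - xjU*xl + xjU*xlU   # (xjU-xj)(xlU-xl) >= 0
    assert min(g1, g2, g3, g4) >= -1e-12
```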
Linearization

Replace xj xℓ by a variable wjℓ (with wj = (wj1, …, wjn)); these are McCormick's convex/concave envelopes for xj xℓ:

    wjℓ ≥ xℓ^L xj + xj^L xℓ − xj^L xℓ^L
    wjℓ ≤ xℓ^U xj + xj^L xℓ − xj^L xℓ^U
    wjℓ ≤ xℓ^L xj + xj^U xℓ − xj^U xℓ^L
    wjℓ ≥ xℓ^U xj + xj^U xℓ − xj^U xℓ^U

[Figure: the bilinear surface w = xy with its McCormick envelopes]

The linearized RLT constraints give valid linear relations between x and w:

    ai wj ≤ xj^L ai x + xj bi − xj^L bi
    ai wj ≥ xj^U ai x + xj bi − xj^U bi.
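The four envelope inequalities can be wrapped in a small helper returning the tightest under- and over-estimate of wjℓ at a point; a sketch with illustrative bounds (`mccormick_bounds` is my own name, not from the slides):

```python
def mccormick_bounds(xj, xl, xjL, xjU, xlL, xlU):
    """Tightest McCormick under/over-estimators for w ~ xj*xl on a box."""
    lo = max(xlL*xj + xjL*xl - xjL*xlL,   # wjl >= ...
             xlU*xj + xjU*xl - xjU*xlU)   # wjl >= ...
    hi = min(xlU*xj + xjL*xl - xjL*xlU,   # wjl <= ...
             xlL*xj + xjU*xl - xjU*xlL)   # wjl <= ...
    return lo, hi

# The true product always lies between the envelopes:
lo, hi = mccormick_bounds(1.0, 2.0, 0.0, 2.0, -1.0, 3.0)
assert lo <= 1.0 * 2.0 <= hi   # lo = 1.0, hi = 3.0 here
```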
Relaxation 1/2

- ∀ i ≤ q, replace each product xj xℓ in x Qi x with the linearizing variable wjℓ, where

      w = ( w11 … w1n
            ⋮        ⋮
            wn1 … wnn )

- Get x Qi x = Qi · w
- Use interval arithmetic on [x^L, x^U] to compute ranges [w^L, w^U] for w
- Adjoin commutativity constraints: ∀ j ≤ ℓ ≤ n, wjℓ = wℓj
- Adjoin square constraints: ∀ i ≤ n s.t. xi is binary, wii = xi
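The interval-arithmetic step for a bilinear entry wjℓ reduces to taking the min and max over the four corner products of the box; a minimal sketch (`product_range` is an illustrative helper name):

```python
def product_range(xjL, xjU, xlL, xlU):
    """Interval-arithmetic range [wL, wU] for w_jl = xj * xl on a box."""
    corners = (xjL*xlL, xjL*xlU, xjU*xlL, xjU*xlU)
    return min(corners), max(corners)

# [-1, 2] * [0.5, 3] -> [-3, 6]
assert product_range(-1.0, 2.0, 0.5, 3.0) == (-3.0, 6.0)
```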
Relaxation 2/2

Relaxed MILP:

    min  c0 x + Q0 w
    ∀ i ≤ q   ci x + Qi w ≤ 0
    ∀ i ≤ m   ai x ≤ bi
    (RLT + commutativity + square constraints)
    x ∈ (ℝ^n1 × ℤ^n2) ∩ [x^L, x^U]
    w ∈ ℝ^{n²} ∩ [w^L, w^U]
RLT literature review
The first paper

Seminal paper: Adams & Sherali, Mgt. Sci. 1986: x ∈ {0, 1}^n

- MILP reformulation via Fortet's inequalities

      wjℓ ≤ min(xj, xℓ)
      wjℓ ≥ max(0, xj + xℓ − 1)

- then continuous relaxation
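For binary variables these inequalities are exact: on {0, 1} points the interval they define collapses to the product itself. A quick check:

```python
from itertools import product

# For binary xj, xl the Fortet inequalities pin wjl to the product:
# max(0, xj + xl - 1) and min(xj, xl) coincide with xj*xl.
for xj, xl in product((0, 1), repeat=2):
    assert max(0, xj + xl - 1) == min(xj, xl) == xj * xl
```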
Mixed products

Adams & Sherali, Op. Res. 1990: x ∈ ℝ^n1 × {0, 1}^n2, with products involving at least one binary variable

- MILP reformulation
- then continuous relaxation
Relaxation of bilinear terms

Sherali & Alameddine, JOGO 1992: x ∈ ℝ^n, with bilinear products

- LP relaxation
- under a special condition, the LP is a reformulation
Mixed products again

Adams & Sherali, Math. Prog. 1993: x ∈ ℝ^n1 × {0, 1}^n2, specialized to bilinear products involving one continuous and one binary variable

- MILP reformulation
- then continuous relaxation
Relaxation hierarchy

Sherali & Adams, DAM 1994: x ∈ ℝ^n1 × {0, 1}^n2

- Convex hull of the MILP feasible region obtained through a hierarchy of RLT constraints
- Level-d RLT: let J1, J2 ⊆ {1, …, n2} with |J1 ∪ J2| = d and

      Fd(J1, J2) = ∏_{j ∈ J1} xj · ∏_{j ∈ J2} (1 − xj);

  multiply the RLT-(d − 1) constraints by all Fd(J1, J2), then linearize to obtain RLT-d
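The level-d factors can be enumerated directly; a small sketch with 0-based indices (the helper names `F` and `level_d_factors` are mine, not from the slides). A useful sanity check: for any fixed d-subset S, the factors over all partitions (J1, J2) of S sum to 1, since ∏_{j ∈ S} (xj + (1 − xj)) = 1.

```python
from itertools import combinations

def F(J1, J2, x):
    """RLT factor: prod_{j in J1} x_j * prod_{j in J2} (1 - x_j)."""
    v = 1.0
    for j in J1:
        v *= x[j]
    for j in J2:
        v *= 1.0 - x[j]
    return v

def level_d_factors(n2, d):
    """All (J1, J2) with J1, J2 disjoint and J1 ∪ J2 a d-subset of {0..n2-1}."""
    for S in combinations(range(n2), d):
        for k in range(d + 1):
            for J1 in combinations(S, k):
                yield J1, tuple(j for j in S if j not in J1)

# Factors over the partitions of each fixed d-subset sum to 1:
x = {0: 0.3, 1: 1.0, 2: 0.0}
totals = {}
for J1, J2 in level_d_factors(3, 2):
    S = tuple(sorted(J1 + J2))
    totals[S] = totals.get(S, 0.0) + F(J1, J2, x)
assert all(abs(t - 1.0) < 1e-12 for t in totals.values())
```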
Etc.

…and many more! Extensions (polynomial and signomial programming, SDP, convex MINLP, disjunctive cuts) and applications.
Reduced RLT
The quadratic case

Consider a mixed-integer QCQP subject to linear equality constraints Ax = b (A of full rank):

    min_{x ∈ ℝ^n1 × ℤ^n2}  c0 x + x Q0 x
    ∀ 1 ≤ i ≤ q   ci x + x Qi x ≤ 0
                  Ax = b
                  x ∈ X ∩ [x^L, x^U],

where X is a polyhedron.
RRLT-1 constraints

Generate RLT constraints from Ax = b:

    ∀ ℓ ≤ n   xℓ Ax = xℓ b   ⇒   A wℓ = xℓ b

Consider the homogeneous system ∀ ℓ ≤ n (A wℓ = 0) and a set N of nonbasic variable index pairs (j, ℓ); let:

    C    = {(x, w) | Ax = b ∧ ∀ j, ℓ ≤ n (wjℓ = xj xℓ)}
    R_N  = {(x, w) | Ax = b ∧ ∀ ℓ ≤ n (A wℓ = b xℓ) ∧ ∀ (j, ℓ) ∈ N (wjℓ = xj xℓ)}
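The RRLT-1 equations are valid on C by construction: multiplying each row ai x = bi by xℓ and substituting wjℓ = xj xℓ gives ai wℓ = bi xℓ. A numeric spot-check (the matrix A below is illustrative):

```python
import random

# On C (Ax = b and w_jl = xj*xl) the RRLT-1 equations A w_l = b x_l
# hold automatically: they are the rows of Ax = b multiplied by x_l.
A = [[1.0, 2.0, -1.0],
     [0.0, 1.0, 3.0]]
random.seed(1)
x = [random.uniform(-2.0, 2.0) for _ in range(3)]
b = [sum(A[i][j] * x[j] for j in range(3)) for i in range(2)]
w = [[x[j] * x[l] for l in range(3)] for j in range(3)]
for l in range(3):
    for i in range(2):
        Awl_i = sum(A[i][j] * w[j][l] for j in range(3))
        assert abs(Awl_i - b[i] * x[l]) < 1e-9
```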
Main result

Thm. Let [n] = {1, …, n}; there exists N ⊆ [n] × [n] such that C = R_N.

Proof. The RRLT system is ∀ ℓ ≤ n (A wℓ − xℓ b = 0)
⇒ (replace b by Ax) ∀ ℓ ≤ n (A wℓ − xℓ Ax = 0)
⇒ (the z = wℓ − xℓ x are the variables of a homogeneous system) ∀ ℓ ≤ n (A(wℓ − xℓ x) = 0).  (1)
Since (1) is homogeneous, extract a square nonsingular subsystem A′ z = 0 of (1). Let N ⊆ [n] × [n] index the nonbasic variables of (1), and B = [n] × [n] ∖ N those corresponding to the basic columns. Impose wjℓ = xj xℓ for all (j, ℓ) ∈ N; by basic linear algebra, A′ z = 0 then implies wjℓ = xj xℓ for all (j, ℓ) ∈ B. ∎

Cor. The RRLT constraints yield an exact reformulation with fewer quadratic terms.
Proof. Only the quadratic terms indexed by N are needed; RRLT implies those indexed by B. ∎
Geometric interpretation

[Figures: (left) C = {(x, y, w) | x = 1 ∧ w = xy}; (middle) McCormick's relaxation of w = xy restricted to x = 1; (right) R = {(x, y, w) | x = 1 ∧ w = y}]

- Notice C = R (the straight red segment)
- The equation w = y can be obtained via RRLT: multiply the equation x = 1 by y and linearize xy via w
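The toy example can be replayed in a few lines: on C we have x = 1 and w = xy, so w = y holds identically, which is exactly the linearized product of the constraint x = 1 with y:

```python
# C = {(x, y, w) | x = 1, w = x*y}: multiplying x = 1 by y and
# linearizing x*y -> w yields w = y, so the bilinear defining
# constraint becomes linear on C (hence C = R).
for y in (-4.0, 0.0, 2.5):
    x = 1.0
    w = x * y      # a point of C
    assert w == y  # the RRLT equation w = y holds
```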
RRLT literature review
Announcement

L., ITOR 2004:

- RRLT constraints are linearly independent
- Preliminary results on pooling problems
General theory

L., JOGO 2005:

- General theory of RRLT constraints
- Reformulation proofs
Automatic reformulation

L. & Pantelides, JOGO 2006:

- Graph-based automatic reformulation algorithm
- Full computational results on pooling and blending problems
Application to quantum chemistry

L., Lavor, Maculan, Chaer Nascimento, DAM 2009:

- Application of an RRLT-2 subset to solving Hartree-Fock systems
New developments
Optimal RRLT

- Feasible region of the QCQP: use R_N instead of C
- R_N relies on the quadratic constraints ∀ (j, ℓ) ∈ N (wjℓ = xj xℓ)
- Degree of freedom: choice of the basic/nonbasic partition B, N of [n] × [n]
- (j, ℓ) ↔ volume Vjℓ of the convex envelope of xj xℓ
- Convexity gap: V(N) = Σ_{(j,ℓ) ∈ N} Vjℓ
- Let N* = arg min_N V(N)
- Smaller gap ⇒ tight bound more likely
- B, N partition [n] × [n] ⇒ N* = [n] × [n] ∖ arg max_B V(B)
- Reduces to finding a max-weight basis of a linear system
- A greedy algorithm solves this problem optimally
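The last step is a classic matroid optimization: the columns of the linear system form the ground set, the volumes Vjℓ are the weights, and scanning columns by non-increasing weight yields a max-weight basis. A generic sketch of that greedy (not the authors' implementation; independence is tested by incremental Gaussian elimination):

```python
def greedy_max_weight_basis(columns, weights):
    """Greedy max-weight basis of span(columns) (vector matroid):
    scan columns by non-increasing weight, keep one iff it is
    linearly independent of those already kept."""
    basis_rows = []  # kept vectors, each reduced against earlier ones
    chosen = []
    order = sorted(range(len(columns)), key=lambda k: -weights[k])
    for k in order:
        v = list(columns[k])
        for r in basis_rows:  # eliminate v against the current basis
            p = next(i for i, e in enumerate(r) if abs(e) > 1e-9)
            v = [vi - (v[p] / r[p]) * ri for vi, ri in zip(v, r)]
        if any(abs(e) > 1e-9 for e in v):
            basis_rows.append(v)
            chosen.append(k)
    return sorted(chosen)

cols = [(1.0, 0.0), (2.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
wts  = [5.0, 4.0, 1.0, 3.0]
# col 0 (w=5) kept; col 1 (w=4) depends on col 0, skipped;
# col 3 (w=3) kept; col 2 (w=1) then depends, skipped.
assert greedy_max_weight_basis(cols, wts) == [0, 3]
```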
RRLT for polynomial programming 1/3

Consider a general polynomial programming MINLP

    min_{x ∈ ℝ^n1 × ℤ^n2}  g0(x)
    ∀ i ≤ q   gi(x) ≤ 0
              Ax = b
              x ∈ X ∩ [x^L, x^U],

where gi ∈ ℚ[x] for all i ∈ {0, …, q}.
RRLT for polynomial programming 2/3

- Reformulation: for all J ⊆ [n − 1], multiply Ax = b by ∏_{j ∈ J} xj
- Linearization: replace each term ∏_{j ∈ J} xj by an added variable wJ (for all J ⊆ [n])
- Adjoin the defining constraints wJ = ∏_{j ∈ J} xj
- Define natural extensions of C, R_N:

      C    = {(x, w) | Ax = b ∧ ∀ J ⊆ [n − 1] (wJ = ∏_{j ∈ J} xj)}
      R_N  = {(x, w) | Ax = b ∧ ∀ J ⊆ [n − 1] (A wJ = b wJ) ∧ ∀ J ∈ N (wJ = ∏_{j ∈ J} xj)}

  where wJ = (w(J,1), …, w(J,n))
- The main result C = R_N still holds
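As in the quadratic case, the higher-level RRLT equations are valid on the extended set C by construction: multiplying row i of Ax = b by the monomial of J gives Σ_j aij w(J,j) = bi wJ. A numeric spot-check with an illustrative A (repeated indices are treated as higher powers, which does not affect the identity):

```python
import random

# On the extended C (Ax = b and w_J = prod_{j in J} x_j), row i of
# Ax = b times the monomial of J yields:
#   sum_j a_ij * w_{(J,j)} = b_i * w_J.
A = [[1.0, -1.0, 2.0],
     [3.0, 0.0, 1.0]]
random.seed(2)
x = [random.uniform(-1.0, 1.0) for _ in range(3)]
b = [sum(A[i][j] * x[j] for j in range(3)) for i in range(2)]

def wmon(J):
    """w_J: the value of the monomial prod_{j in J} x_j."""
    v = 1.0
    for j in J:
        v *= x[j]
    return v

for J in [(), (0,), (1,), (0, 2)]:
    for i in range(2):
        lhs = sum(A[i][j] * wmon(tuple(J) + (j,)) for j in range(3))
        assert abs(lhs - b[i] * wmon(J)) < 1e-9
```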
RRLT for polynomial programming 3/3

- The choice of an optimal N extends from the quadratic case, but there is an added complication:
  – Vij and Vijk are expressed in different units of measure
  – summing up VJ for J's of different sizes may not make much sense
- Multi-objective problem: ∀ p ∈ [n]  max Σ_{|J| = p} VJ
- Thm. An efficient solution is an optimum of max Σ_{J ⊆ [n]} VJ
- Greedy is still OK
Sparsity 1/4

- Polynomial programs are never dense in practice
- RRLT needs B ∪ N = P([n])
- Need to introduce exponentially many new monomials?
Sparsity 2/4

- β = set of multi-indices of the monomials already in the problem
- Every new monomial J ∉ β yields a new variable wJ
- Sometimes ∃ J ∉ β s.t. wJ yields > 1 new RRLT constraints
- E.g.: x1 + x2 = 1 ∧ 2x1 − x2 = 3 ∧ β = {(1, 3)}:
  one new monomial (x2 x3) ⇒ two new RRLT constraints

      x1 + x2 = 1   (× x3)  ⇒   w13 + w23 = x3
      2x1 − x2 = 3  (× x3)  ⇒   2 w13 − w23 = 3 x3

- Principle: one new equation, one fewer degree of freedom
- Create fewer J's than new RRLT constraints
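The example is easy to verify: the 2×2 linear system fixes x1 = 4/3 and x2 = −1/3, and both RRLT equations then hold for every value of x3 once w13, w23 are set to the corresponding products:

```python
# Sparsity 2/4 example: one new monomial x2*x3 gives two RRLT
# equations (w13 + w23 = x3 and 2*w13 - w23 = 3*x3), both satisfied
# whenever x solves the linear system and w carries the products.
for x3 in (-1.0, 0.5, 2.0):
    x1, x2 = 4.0 / 3.0, -1.0 / 3.0   # unique solution of the 2x2 system
    w13, w23 = x1 * x3, x2 * x3
    assert abs(w13 + w23 - x3) < 1e-12       # (x1 + x2) * x3 = x3
    assert abs(2*w13 - w23 - 3*x3) < 1e-12   # (2*x1 - x2) * x3 = 3*x3
```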
Sparsity 3/4

Problem: look for a subset ρ of the rows of Ax = b to be multiplied by a subset σ of P([n − 1]) such that the number of new variables wJ is smaller than the number of new RRLT constraints

Formalization: consider the bipartite graph (U, V, E)
[Figure: U-nodes wJ × (ai x = bi) on one side, V-nodes J̄ on the other]
U = products "row i times variable wJ" (indexed by (i, J))
V = variables wJ̄ with J̄ ∉ β (indexed by J̄)
E = incidence of the added variables in the RRLT constraints

Aim: find an induced subgraph (U′, V′, E′) such that |U′| is maximum, |U′| > |V′|, and V′ = neighb(U′)

SEA 2012 – 36 / 40
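On the toy example from the previous slide (rows x1 + x2 = 1 and 2x1 − x2 = 3 multiplied by x3, with β = {(1, 3)}), the induced subgraph sought here can be found by brute force over subsets of U. A minimal sketch; the adjacency encoding below is an assumption made for illustration:

```python
from itertools import combinations

# U: (row, multiplier index) pairs; V: new monomials J not in beta.
# beta = {(1, 3)}, so w13 already exists and only w23 = x2*x3 is new.
U = [(1, 3), (2, 3)]                  # rows 1 and 2, each multiplied by x3
adj = {                               # new variables appearing in each product
    (1, 3): {(2, 3)},                 # row 1 * x3 involves w13 (in beta) and w23 (new)
    (2, 3): {(2, 3)},                 # row 2 * x3 likewise introduces only w23
}

best = None
for k in range(len(U), 0, -1):        # try the largest U' first
    for Up in combinations(U, k):
        Vp = set().union(*(adj[u] for u in Up))   # V' = neighb(U')
        if len(Up) > len(Vp):                     # more constraints than new vars
            best = (Up, Vp)
            break
    if best:
        break

print(best)   # both products selected against a single new variable w23
```

Here the search returns U′ = {(1, 3), (2, 3)} with V′ = {(2, 3)}: two new RRLT constraints at the cost of one new variable, which is exactly the sparsity gain the slide describes.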
Sparsity 4/4

Mathematical Programming formulation:

    max  Σ_{(i,J′) ∈ U} u_{i,J′}
    s.t. Σ_{(i,J′) ∈ U} u_{i,J′} ≥ Σ_{J ∉ β} v_J + 1
         v_J ≥ u_{i,J′}                 ∀ {(i, J′), J} ∈ E
         u ∈ {0, 1}^{|U|}
         v ∈ {0, 1}^{|P([n]) ∖ β|}

Thm. This problem is in P
Proof: use a matching-based algorithm

SEA 2012 – 37 / 40
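Since the toy instance is tiny, the 0/1 formulation can be solved by exhaustive enumeration rather than a MIP solver or the matching-based algorithm from the theorem; a sketch, with the instance data assumed from the earlier example:

```python
from itertools import product

U = [(1, 3), (2, 3)]                       # products row_i * x3 (binary u variables)
V = [(2, 3)]                               # new monomials J not in beta (binary v variables)
E = [((1, 3), (2, 3)), ((2, 3), (2, 3))]   # edges {(i, J'), J}

best_obj, best_sol = -1, None
for bits in product((0, 1), repeat=len(U) + len(V)):
    u = dict(zip(U, bits[:len(U)]))
    v = dict(zip(V, bits[len(U):]))
    if sum(u.values()) < sum(v.values()) + 1:   # sum u >= sum v + 1
        continue
    if any(v[J] < u[iJ] for iJ, J in E):        # v_J >= u_{i,J'} on each edge
        continue
    if sum(u.values()) > best_obj:
        best_obj, best_sol = sum(u.values()), (u, v)

print(best_obj)   # 2: both products selected, paying for one new variable
```

The optimum selects both u variables against a single v variable, matching the induced subgraph found combinatorially on the previous slide; the edge constraints v_J ≥ u_{i,J′} are what enforce V′ ⊇ neighb(U′).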
Why bother?

SEA 2012 – 38 / 40
Use within sBB

CPU time in sBB depends on the number of nodes and the time to solve each node
Need few, small convex relaxation LPs
Usual approach: concentrate on fewer nodes (tight bounds) via larger LPs (valid cuts)
Different approach: slacken the bound, aim to solve each LP faster
Outcome:
– bound quality: 0.07% worse
– CPU time: 40% improvement
Future work: embed in sBB

SEA 2012 – 39 / 40
Thank you

SEA 2012 – 40 / 40