JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS: Vol. 50, No. 1, JULY 1986

TECHNICAL NOTE

A Note on an Algorithm for Generalized Fractional Programs¹

J. P. Crouzeix,² J. A. Ferland,³ and S. Schaible⁴

Communicated by M. Avriel

Abstract. We present a modification of an algorithm recently suggested by the same authors in this journal (Ref. 1). The speed of convergence is improved for the same complexity of computation.

Key Words. Fractional programming, multiratio programming, convergence, speed of convergence.

1. Introduction

As in Ref. 1, we consider the nonlinear program

(P)    \bar{\theta} = \inf_{x \in S} \max_{1 \le i \le p} \, [f_i(x)/g_i(x)],

where S is a nonempty subset of R^n, the functions f_i and g_i are continuous on S, and the functions g_i are positive on S. The algorithm described in Section 3 of Ref. 1 is modified as follows.

Step 1. Start with some x^0 \in S. Let \theta_1 = \max_i \, [f_i(x^0)/g_i(x^0)] and k = 1.

Step 2. Solve the following problem:

(Q_k)    F_k(\theta_k) = \inf_{x \in S} \max_{1 \le i \le p} \, [1/g_i(x^{k-1})][f_i(x) - \theta_k g_i(x)];

let x^k be an optimal solution of (Q_k).

¹ The research of S. Schaible was supported by Grants A4534 and A5408 from NSERC.
² Professor, Département de Mathématiques Appliquées, Université de Clermont II, Aubière, France.
³ Professor, Département d'Informatique et de Recherche Opérationnelle, Université de Montréal, Montréal, Québec, Canada.
⁴ Professor, Department of Finance and Management Science, Faculty of Business, University of Alberta, Edmonton, Alberta, Canada.


Step 3. If F_k(\theta_k) = 0, then stop.

Step 4. If F_k(\theta_k) \ne 0, take

\theta_{k+1} = \max_i \, [f_i(x^k)/g_i(x^k)].

Let k = k + 1, and go back to Step 2.

In the original algorithm in Ref. 1, the following problem is solved in Step 2 instead of (Q_k):

F(\theta_k) = \inf_{x \in S} \max_{1 \le i \le p} \, [f_i(x) - \theta_k g_i(x)].

We shall study the convergence of the modified algorithm in the case where S is compact. In this case, the problems (Q_k) have an optimal solution and the algorithm can be applied.
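As a reading aid only, and not part of the original note, the following is a minimal Python sketch of Steps 1-4 under illustrative assumptions: S is taken to be a box, the ratios are supplied as lists of callables, and the inner problem (Q_k) is handed to a general-purpose derivative-free solver (scipy.optimize.minimize); the solver choice, tolerance, and iteration cap are the sketch's own, not the authors'.

```python
# Illustrative sketch of the modified algorithm (Steps 1-4); assumptions as noted above.
import numpy as np
from scipy.optimize import minimize


def modified_algorithm(f_list, g_list, bounds, x0, tol=1e-8, max_iter=100):
    """Approximately minimize max_i f_i(x)/g_i(x) over the box `bounds`."""
    f = lambda y: np.array([fi(y) for fi in f_list])
    g = lambda y: np.array([gi(y) for gi in g_list])

    x = np.asarray(x0, dtype=float)              # x^0 in S
    theta = np.max(f(x) / g(x))                  # Step 1: theta_1
    for _ in range(max_iter):
        w = 1.0 / g(x)                           # weights 1/g_i(x^{k-1}) -- the modification
        # Step 2: inner problem (Q_k), a nonsmooth min-max, solved here crudely with a
        # derivative-free method warm-started at x^{k-1} (bounded Nelder-Mead needs a recent SciPy).
        Fk = lambda y, w=w, th=theta: np.max(w * (f(y) - th * g(y)))
        res = minimize(Fk, x, method="Nelder-Mead", bounds=bounds)
        x = res.x                                # x^k
        if abs(res.fun) <= tol:                  # Step 3: F_k(theta_k) = 0 (to tolerance)
            break
        theta = np.max(f(x) / g(x))              # Step 4: theta_{k+1}
    return theta, x
```

The only change relative to the original algorithm of Ref. 1 is the weight vector w; setting w to the all-ones vector recovers the unnormalized subproblem F(\theta_k) above, at the same per-iteration cost.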

2. Results

We show the following theorem.

Theorem 2.1. Assume that S is compact.
(a) If F_k(\theta_k) = 0, then \theta_k = \bar{\theta} and x^k is an optimal solution of (P).
(b) The sequence \{\theta_k\}, if not finite, converges at least linearly to \bar{\theta}, and each convergent subsequence of \{x^k\} converges to an optimal solution of (P).

Proof.

Problem (P) can be equivalently formulated as

(P)    \bar{\theta} = \inf_{x \in S} \max_{1 \le i \le p} \, \frac{f_i(x)/g_i(x^{k-1})}{g_i(x)/g_i(x^{k-1})},

for any k = 1, 2, \ldots. Define

F_k(\theta) = \inf_{x \in S} \max_{1 \le i \le p} \, [1/g_i(x^{k-1})][f_i(x) - \theta g_i(x)].

By construction, \theta_k \ge \bar{\theta}. Proposition 2.1(b) and (d) of Ref. 1 imply part (a) of the theorem. Now, let \hat{x} be any optimal solution of (P). Then, Proposition 2.1(d) and Proposition 2.2 of Ref. 1 imply that

F_k(\theta_k) \le F_k(\bar{\theta}) + (\bar{\theta} - \theta_k) \min_i \, [g_i(\hat{x})/g_i(x^{k-1})].    (1)

Note that F_k(\bar{\theta}) = 0, by Proposition 2.1(c) in Ref. 1.


Since x^k is an optimal solution of (Q_k), then

F_k(\theta_k) = \max_i \, \frac{f_i(x^k) - \theta_k g_i(x^k)}{g_i(x^{k-1})} \ge \frac{g_j(x^k)}{g_j(x^{k-1})} \, (\theta_{k+1} - \theta_k),    (2)

for all j \in J(x^k) = \{\, j : f_j(x^k)/g_j(x^k) = \theta_{k+1} \,\}. Combining (1) and (2) and rearranging terms, we obtain

\theta_{k+1} - \bar{\theta} \le (\theta_k - \bar{\theta}) \left[ 1 - \frac{g_j(x^{k-1})}{g_j(x^k)} \min_i \, \frac{g_i(\hat{x})}{g_i(x^{k-1})} \right].

The bracketed factor is bounded above by 1 - \alpha for some \alpha > 0, because S is compact and the g_i are positive and continuous on S. Hence,

0 \le \theta_{k+1} - \bar{\theta} \le (\theta_k - \bar{\theta})(1 - \alpha).

Thus, we have shown the linear convergence of the sequence \{\theta_k\} to \bar{\theta}. Let \bar{x} be the limit of any convergent subsequence of the sequence \{x^k\}. Then \bar{x} \in S, and there exist \tilde{x} \in S and an increasing sequence of positive integers \{n_k\} such that the sequence (x^{n_k - 1}, x^{n_k}) converges to (\tilde{x}, \bar{x}). For y \in S and \mu \in R, define

r(y, \mu) = \inf_{x \in S} \max_i \, [1/g_i(y)][f_i(x) - \mu g_i(x)].

Clearly, r is continuous on S \times R. We have

r(x^{n_k - 1}, \theta_{n_k}) = \max_i \, [1/g_i(x^{n_k - 1})][f_i(x^{n_k}) - \theta_{n_k} g_i(x^{n_k})].

Letting k \to +\infty, we obtain

r(\tilde{x}, \bar{\theta}) = \max_i \, [1/g_i(\tilde{x})][f_i(\bar{x}) - \bar{\theta} g_i(\bar{x})].

In view of Proposition 2.1 of Ref. 1, then r(\tilde{x}, \bar{\theta}) = 0 and \bar{x} is an optimal solution of (P).

If (P) has a unique optimal solution, then the sequence \{x^k\} generated by the algorithm is convergent. In this case, the speed of convergence of \{\theta_k\} to \bar{\theta} is at least superlinear, as shown by the following theorem.


Theorem 2.2. Assume that S is compact and the sequence \{x^k\} converges to \bar{x}. Then \{\theta_k\} converges superlinearly to \bar{\theta}. If, in addition, the functions g_i satisfy a Lipschitz condition on S, then there exists a positive constant M such that

0
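Purely as an illustration of how the sketch from Section 1 might be exercised, and with all data invented rather than taken from the note, one can run it on a small two-ratio instance whose denominators are positive on a box:

```python
# Hypothetical two-ratio instance on the box [1, 2]^2 (invented data, for illustration only).
f_list = [lambda x: x[0] ** 2 + 1.0, lambda x: x[1] ** 2 + 2.0]
g_list = [lambda x: x[0] + x[1], lambda x: 2.0 * x[0] + x[1]]
bounds = [(1.0, 2.0), (1.0, 2.0)]

theta, x = modified_algorithm(f_list, g_list, bounds, x0=[1.5, 1.5])
print("approximate optimal value:", theta)
print("approximate minimizer:", x)
```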
