Applied Mathematical Sciences, Vol. 4, 2010, no. 69, 3437 - 3446

Efficient Algorithm for Minimization of Non-Linear Function

K. Karthikeyan
School of Advanced Sciences, VIT University, Vellore 632014, India
[email protected]

Abstract

In this paper, we suggest a new algorithm for the minimization of a non-linear function. A comparative study of the new algorithm with Newton's algorithm and the External Touch Technique algorithm is then carried out by means of examples.

Keywords: Newton's algorithm, External Touch Technique algorithm, iterative method, Adomian's method

1. INTRODUCTION

Optimization problems, with or without constraints, arise in various fields such as science, engineering, economics and the management sciences, wherever numerical information is processed. In recent times, many problems in business situations and engineering design have been modeled as optimization problems for taking optimal decisions. In fact, numerical optimization techniques have made deep inroads into almost all branches of engineering and mathematics. An unconstrained minimization problem is one where a value of the vector x is sought that minimizes the objective function f(x). This problem can be considered a particular case of the general constrained non-linear programming problem. The study of unconstrained minimization techniques provides the basic understanding necessary for the study of constrained minimization methods, and these methods can be used to solve certain complex engineering analysis problems. For example, the displacement response (linear or non-linear) of any structure under any specified load condition can be found by minimizing its potential energy.


To solve unconstrained nonlinear minimization problems arising in the diversified fields of engineering and technology, several methods are available. For instance, multi-step nonlinear conjugate gradient methods [5], a scaled nonlinear conjugate gradient algorithm [2], and the ABS-MPVT algorithm [7] are used for solving unconstrained optimization problems. Newton's method [8] is used for various classes of optimization problems, such as unconstrained minimization problems and equality constrained minimization problems. Chun [4] and Basto [3] have proposed and studied several methods for non-linear equations with higher-order convergence using the decomposition technique of Adomian [1, 9]. Vinay Kanwar et al. [10] introduced a new algorithm, the External Touch Technique, for solving non-linear equations, together with a comparative study of the new algorithm and Newton's algorithm. J. H. Yun [11] proposed a note on a three-step iterative method for nonlinear equations. Recently, Jishe Feng [6] introduced a two-step iterative method for solving nonlinear equations, with a comparative study of the new iterative method against Newton's algorithm, the Abbasbandy method and the Basto method. In this paper, we suggest a new algorithm for the minimization of non-linear functions. We then present a comparative study of the new algorithm, Newton's algorithm and the External Touch Technique algorithm by means of examples.

2. NEW ALGORITHMS

Consider the nonlinear optimization problem: Minimize { f(x), x ∈ R, f: R → R }, where f is a non-linear, twice differentiable function.

2.1. External Touch Technique Algorithm

Consider the function G(x) = x − g(x)/g′(x), where g(x) = f′(x) and f(x) is the function to be minimized. G′(x) is defined around the critical point x* of f(x) if g′(x*) = f″(x*) ≠ 0, and is given by G′(x) = g(x)g″(x)/g′(x)². If we assume that g″(x*) ≠ 0, we have G′(x*) = 0 iff g(x*) = 0.

Consider the equation

    g(x) = 0                                                        (1)

whose one or more roots are to be found. The curve y = g(x) represents the graph of the function g(x); assume that an initial estimate x₀ is known for the desired root of equation (1). A circle C₁ of radius g(x₀ + h) is drawn with centre at the point (x₀ + h, g(x₀ + h)) on the curve y = g(x), where h is a small positive or negative quantity. Another circle C₂, with radius g(x₀ − h) and centre at (x₀ − h, g(x₀ − h)) on the curve, is drawn such that it touches C₁ externally. Let x₁ = x₀ + h, |h| < 1. Since the circles C₁ and C₂ touch externally, we have

    h² = g(x₀ + h) g(x₀ − h).

Expanding g(x₀ + h) and g(x₀ − h) in Taylor series (omitting fourth and higher powers of h) and simplifying, we conclude that

    h = ± g(x₀) / √(1 + g′²(x₀) − g(x₀)g″(x₀)),                     (2)

where h is taken positive or negative according as x₀ lies to the left or right of the true root, or as the slope of the curve at (x₀, g(x₀)) is positive or negative. If x₀ lies to the left of the true root, h is taken positive; otherwise negative. Therefore, we get the first approximation to the root as x₁ = x₀ ± h, i.e.

    x₁ = x₀ ± g(x₀) / √(1 + g′²(x₀) − g(x₀)g″(x₀)).

Since g(x) = f′(x), it follows that

    x₁ = x₀ ± f′(x₀) / √(1 + f″²(x₀) − f′(x₀)f‴(x₀)).

The general iteration formula for successive approximations to the minimizing point of the non-linear function f is

    xₙ₊₁ = xₙ ± f′(xₙ) / √(1 + f″²(xₙ) − f′(xₙ)f‴(xₙ)).             (3)

The iterative method (3) has quadratic convergence [10].
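As a concrete illustration, iteration (3) can be sketched in a few lines of Python. This is our own illustrative code, not taken from [10]; in particular, the ± sign in (3) is resolved here as −sign(f″(x))·f′(x), i.e. a Newton-like descent direction, which is one reading of the paper's left/right rule rather than a quotation of it.

```python
import math

def ett_minimize(fp, fpp, fppp, x0, tol=1e-10, max_iter=100):
    """Minimize f via the External Touch Technique iteration (3).

    fp, fpp, fppp are the first, second and third derivatives of f.
    The +/- sign of the step is chosen as -sign(f''(x)), so the step
    points in the Newton descent direction -f'(x)/f''(x).
    """
    x = x0
    for _ in range(max_iter):
        g, gp, gpp = fp(x), fpp(x), fppp(x)
        h = -math.copysign(1.0, gp) * g / math.sqrt(1.0 + gp * gp - g * gpp)
        x += h
        if abs(h) < tol:
            break
    return x

# Example 3.1 below: f(x) = x^3 - 2x - 5, so f' = 3x^2 - 2, f'' = 6x, f''' = 6.
x_star = ett_minimize(lambda x: 3 * x * x - 2, lambda x: 6 * x, lambda x: 6.0, 1.0)
print(round(x_star, 6))  # minimizer sqrt(2/3), i.e. 0.816497
```

With this sign convention the first iterate from x₀ = 1 is 0.820395, matching the External Touch column of Example 3.1.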

2.2. New Algorithm (Efficient Two-Stage Iterative Algorithm)

Consider the non-linear equation g(x) = 0. Writing g(x + h) in a Taylor series expansion about x, we obtain

    g(x + h) = g(x) + h g′(x) + φ(h),   where φ(h) = g(x + h) − g(x) − h g′(x).

Supposing that g′(x) ≠ 0, one searches for a value of h such that g(x + h) = 0, that is,

    g(x) + h g′(x) + φ(h) = 0,   so that   h g′(x) = −g(x) − φ(h),

    h = −g(x)/g′(x) − φ(h)/g′(x).                                   (4)

Equation (4) can be rewritten as

    h = c + N(h),                                                   (5)

where c = −g(x)/g′(x) and N(h) = −φ(h)/g′(x) = −[g(x + h) − g(x) − h g′(x)]/g′(x). Here c is a constant and N(h) is a non-linear function. When applying Adomian's method [1] to (5), we use the technique of Basto [3] and obtain

    S = [N(c + S*) − N′(c + S*) S*] / [1 − N′(c + S*)].

Applying the Adomian method to (5), we have

    A₀ = N(h₀) = N(c),   N(c)/(1 − N′(c)) = −g(x + c) / (2g′(x) − g′(x + c)).

Now we construct the iterative method. For h ≈ h₀ = −g(x)/g′(x), one obtains x + h ≈ x − g(x)/g′(x), which yields Newton's method

    xₙ₊₁ = xₙ − g(xₙ)/g′(xₙ).

For k = 1, one obtains h ≈ h₀ + N(h₀), which suggests the following two-step iterative method [6]:

    yₙ = xₙ − g(xₙ)/g′(xₙ),
    xₙ₊₁ = yₙ − g(yₙ) / (2g′(xₙ) − g′(yₙ)).

Since g(x) = f′(x), the above iterative procedure becomes

    yₙ = xₙ − f′(xₙ)/f″(xₙ),
    xₙ₊₁ = yₙ − f′(yₙ) / (2f″(xₙ) − f″(yₙ)),

which is the required algorithm.
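A minimal Python sketch of this two-stage scheme follows (our own illustration, not code from the paper; the caller supplies f′ and f″ as callables):

```python
def two_stage_minimize(fp, fpp, x0, tol=1e-10, max_iter=100):
    """Two-stage iterative algorithm of Section 2.2 for minimizing f.

    Predictor:  y_n     = x_n - f'(x_n)/f''(x_n)          (Newton step)
    Corrector:  x_{n+1} = y_n - f'(y_n)/(2 f''(x_n) - f''(y_n))
    """
    x = x0
    for _ in range(max_iter):
        y = x - fp(x) / fpp(x)
        x_new = y - fp(y) / (2.0 * fpp(x) - fpp(y))
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example 3.1 below: f(x) = x^3 - 2x - 5 has f' = 3x^2 - 2 and f'' = 6x.
x_star = two_stage_minimize(lambda x: 3 * x * x - 2, lambda x: 6 * x, 1.0)
print(round(x_star, 6))  # 0.816497, reached in two steps from x0 = 1
```

From x₀ = 1 the first full step gives y₀ = 5/6 and x₁ = 23/28 ≈ 0.821429, matching the New Algorithm column of Example 3.1.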

Theorem: Let α ∈ I be a zero of a sufficiently differentiable function g: I → R on an open interval I. If x₀ is sufficiently close to α, then the new algorithm has quadratic convergence.


Proof: Since g is sufficiently differentiable, expanding g(xₙ) and g′(xₙ) about α we have

    g(xₙ) = g′(α)(eₙ + c₂eₙ² + c₃eₙ³ + c₄eₙ⁴ + O(eₙ⁵)),             (6)
    g′(xₙ) = g′(α)(1 + 2c₂eₙ + 3c₃eₙ² + 4c₄eₙ³ + O(eₙ⁴)),           (7)

where eₙ = xₙ − α and cₖ = g⁽ᵏ⁾(α)/(k! g′(α)), k = 2, 3, ….

From (6) and (7), we have

    −g(xₙ)/g′(xₙ) = −eₙ + c₂eₙ² − 2(c₂² − c₃)eₙ³ − (7c₂c₃ − 3c₄ − 4c₂³)eₙ⁴ + O(eₙ⁵),

so that

    yₙ = α + c₂eₙ² − 2(c₂² − c₃)eₙ³ − (7c₂c₃ − 3c₄ − 4c₂³)eₙ⁴ + O(eₙ⁵).   (8)

Expanding g(yₙ) about α and using (8), we have

    g(yₙ) = g′(α)(c₂eₙ² − 2(c₂² − c₃)eₙ³ − (7c₂c₃ − 3c₄ − 5c₂³)eₙ⁴ + O(eₙ⁵)).   (9)

From (7) and (9),

    g′(yₙ) = g′(α)(2c₂eₙ − 6(c₂² − c₃)eₙ² − 4(7c₂c₃ − 3c₄ − 5c₂³)eₙ³ + O(eₙ⁴)),   (10)
    2g′(xₙ) − g′(yₙ) = g′(α)(2 + 2c₂eₙ + 6c₂²eₙ² + (28c₂c₃ − 4c₄ − 20c₂³)eₙ³ + O(eₙ⁴)).   (11)

From (9) and (11), we have

    g(yₙ)/(2g′(xₙ) − g′(yₙ)) = (c₂/2)eₙ² + (c₃ − 3c₂²/2)eₙ³ + ((−9c₂c₃ + 3c₄ + 5c₂³)/2)eₙ⁴ + O(eₙ⁵).   (12)

From (8) and (12), we have

    xₙ₊₁ = α + (c₂/2)eₙ² + (c₃ − c₂²/2)eₙ³ + ⋯,

which implies that the new iterative method has quadratic convergence.

3. NUMERICAL EXAMPLES

We present numerical examples to illustrate the efficiency of the new algorithm, compared with Newton's algorithm and the External Touch Technique algorithm. Newton's iterative method converges if the initial guess x₀ is chosen very close to the root α. The order of convergence of the External Touch Technique algorithm [10] is also quadratic, but in some cases its rate of convergence is faster than that of Newton's algorithm. Though our new algorithm [6] also has quadratic convergence, its rate of convergence is faster than both Newton's method and the External Touch Technique in most cases, as is clear from the following examples.
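For reference, the Newton baseline used throughout these comparisons is the standard iteration xₙ₊₁ = xₙ − f′(xₙ)/f″(xₙ). A small Python sketch (our own code, for reproducing the Newton columns in the examples):

```python
def newton_minimize(fp, fpp, x0, tol=1e-10, max_iter=100):
    """Newton's iteration x_{n+1} = x_n - f'(x_n)/f''(x_n) for a minimizer."""
    x = x0
    for _ in range(max_iter):
        step = fp(x) / fpp(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example 3.1: the Newton iterates from x0 = 1 are 0.833333, 0.816667, ...
x_star = newton_minimize(lambda x: 3 * x * x - 2, lambda x: 6 * x, 1.0)
print(round(x_star, 6))  # 0.816497
```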

Example 3.1: Consider the function f(x) = x³ − 2x − 5, x ∈ R. The minimizing point of the function is 0.816497, which is obtained in 3 iterations by Newton's method, in 4 iterations by the External Touch Technique algorithm, and in 2 iterations by the new algorithm for the initial value x₀ = 1. The variation in the number of iterations needed for convergence from the initial value x₀ = 2 is also shown.

Initial value x₀ = 1:

Iteration | Newton's Algorithm | External Touch Technique | New Algorithm
0         | 1.000000           | 1.000000                 | 1.000000
1         | 0.833333           | 0.820395                 | 0.821429
2         | 0.816667           | 0.816575                 | 0.816497
3         | 0.816497           | 0.816498                 |
4         |                    | 0.816497                 |

Initial value x₀ = 2:

Iteration | Newton's Algorithm | External Touch Technique | New Algorithm
0         | 2.000000           | 2.000000                 | 2.000000
1         | 1.166667           | 0.915348                 | 1.044118
2         | 0.869048           | 0.818430                 | 0.824814
3         | 0.818085           | 0.816536                 | 0.816497
4         | 0.816498           | 0.816497                 |
5         | 0.816497           |                          |

Example 3.2: Consider the function f(x) = x⁴ − x − 10, x ∈ R. The minimizing point of the function is 0.629961, which is obtained in 4 iterations by the External Touch Technique algorithm, in 4 iterations by Newton's algorithm, and in 3 iterations by the new algorithm for the initial value x₀ = 1. The variation in the number of iterations needed for convergence from the initial value x₀ = 2 is also shown.

Initial value x₀ = 1:

Iteration | Newton's Algorithm | External Touch Technique | New Algorithm
0         | 1.000000           | 1.000000                 | 1.000000
1         | 0.750000           | 0.648877                 | 0.710145
2         | 0.648148           | 0.630344                 | 0.632837
3         | 0.630466           | 0.629969                 | 0.629961
4         | 0.629961           | 0.629961                 |

Initial value x₀ = 2:

Iteration | Newton's Algorithm | External Touch Technique | New Algorithm
0         | 2.000000           | 2.000000                 | 2.000000
1         | 1.354167           | 0.915447                 | 1.233443
2         | 0.948222           | 0.640668                 | 0.814804
3         | 0.724830           | 0.630182                 | 0.650158
4         | 0.641836           | 0.629965                 | 0.630031
5         | 0.630179           | 0.629961                 | 0.629961
6         | 0.629961           |                          |

Example 3.3: Consider the function f(x) = xeˣ − 1, x ∈ R. The minimizing point of the function is −1, which is obtained in 7 iterations by Newton's algorithm, in 26 iterations by the External Touch Technique algorithm, and in 5 iterations by the new algorithm for the initial value x₀ = 1. The variation in the number of iterations needed for convergence from the initial value x₀ = 2 is also shown.

Initial value x₀ = 1:

Iteration | Newton's Algorithm | External Touch Technique | New Algorithm
0         | 1.000000           | 1.000000                 | 1.000000
1         | 0.333333           | -0.877016                | 0.190778
2         | -0.238095          | -0.924255                | -0.462498
3         | -0.670528          | -0.952193                | -0.871295
4         | -0.918350          | -0.969405                | -0.995009
5         | -0.993836          | -0.980255                | -1.000000
6         | -0.999962          | -0.987192                |
7         | -1.000000          | -0.991664                |
8         |                    | -0.994563                |
9         |                    | -0.996449                |
10        |                    | -0.997679                |
11        |                    | -0.998482                |
12        |                    | -0.999007                |
13        |                    | -0.999350                |
14        |                    | -0.999575                |
15        |                    | -0.999722                |
16        |                    | -0.999818                |
17        |                    | -0.999881                |
18        |                    | -0.999922                |
19        |                    | -0.999949                |
20        |                    | -0.999967                |
21        |                    | -0.999978                |
22        |                    | -0.999986                |
23        |                    | -0.999991                |
24        |                    | -0.999996                |
25        |                    | -0.999999                |
26        |                    | -1.000000                |

Initial value x₀ = 2:

Iteration | Newton's Algorithm | External Touch Technique | New Algorithm
0         | 2.000000           | 2.000000                 | 2.000000
1         | 1.250000           | -0.972898                | 1.085598
2         | 0.557692           | -0.982481                | 0.264683
3         | -0.051330          | -0.988623                | -0.407467
4         | -0.538159          | -0.992591                | -0.844374
5         | -0.854409          | -0.995166                | -0.991951
6         | -0.981421          | -0.996842                | -0.999998
7         | -0.999661          | -0.997935                | -1.000000
8         | -0.999999          | -0.998649                |
9         | -1.000000          | -0.999116                |
10        |                    | -0.999422                |
11        |                    | -0.999621                |
12        |                    | -0.999752                |
13        |                    | -0.999838                |
14        |                    | -0.999894                |
15        |                    | -0.999930                |
16        |                    | -0.999954                |
17        |                    | -0.999970                |
18        |                    | -0.999980                |
19        |                    | -0.999987                |
20        |                    | -0.999992                |
21        |                    | -0.999995                |
22        |                    | -0.999996                |
23        |                    | -0.999998                |
24        |                    | -0.999999                |
25        |                    | -1.000000                |
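The iteration counts in Example 3.3 can be spot-checked numerically. The sketch below (our own code, not from the paper) runs Newton's method and the two-stage algorithm of Section 2.2 on f(x) = xeˣ − 1 from x₀ = 1 and counts iterations until the error |xₙ − (−1)| falls below 10⁻⁵; the External Touch column is omitted for brevity.

```python
import math

def newton_count(fp, fpp, x0, alpha, tol=1e-5, max_iter=100):
    """Count Newton iterations until |x_n - alpha| < tol."""
    x, n = x0, 0
    while abs(x - alpha) >= tol and n < max_iter:
        x -= fp(x) / fpp(x)
        n += 1
    return n

def two_stage_count(fp, fpp, x0, alpha, tol=1e-5, max_iter=100):
    """Count two-stage (Section 2.2) iterations until |x_n - alpha| < tol."""
    x, n = x0, 0
    while abs(x - alpha) >= tol and n < max_iter:
        y = x - fp(x) / fpp(x)
        x = y - fp(y) / (2.0 * fpp(x) - fpp(y))
        n += 1
    return n

# f(x) = x e^x - 1:  f'(x) = (1 + x) e^x,  f''(x) = (2 + x) e^x.
fp = lambda x: (1.0 + x) * math.exp(x)
fpp = lambda x: (2.0 + x) * math.exp(x)

print(newton_count(fp, fpp, 1.0, -1.0))     # 7, as in the table above
print(two_stage_count(fp, fpp, 1.0, -1.0))  # 5, as in the table above
```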

4. CONCLUSION

In this paper, we have introduced a new algorithm for the minimization of non-linear functions and compared it with Newton's algorithm and the External Touch Technique algorithm through numerical experiments. It is clear from these experiments that the rate of convergence of the new algorithm is better than that of the other two algorithms.

REFERENCES

[1] G. Adomian: A review of the decomposition method and some recent results for nonlinear equations. Comput. Math. Appl. 21(5): 101-127 (1991).
[2] N. Andrei: A scaled nonlinear conjugate gradient algorithm for unconstrained minimization. Optimization 57(4): 549-570 (2008).
[3] M. Basto, V. Semiao, F. L. Calheiros: A new iterative method to compute nonlinear equations. Appl. Math. Comput. 173: 468-483 (2006).
[4] C. Chun: Iterative methods improving Newton's method by the decomposition method. Comput. Math. Appl. 50: 1559-1568 (2005).
[5] J. A. Ford, Y. Narushima and H. Yabe: Multi-step nonlinear conjugate gradient methods for unconstrained minimization. Computational Optimization and Applications 40(2): 191-216 (2008).
[6] Jishe Feng: A new two-step method for solving nonlinear equations. International Journal of Nonlinear Science 8(1): 40-44 (2009).
[7] L. P. Pang, E. Spedicato, Z. Q. Xia and W. Wang: A method for solving the system of linear equations and linear inequalities. Mathematical and Computer Modelling 46(5-6): 823-836 (2007).
[8] B. T. Polyak: Newton's method and its use in optimization. European Journal of Operational Research 181(3): 1086-1096 (2007).
[9] Santanu Saha Ray: Solution of the coupled Klein-Gordon Schrodinger equation using the modified decomposition method. International Journal of Nonlinear Science 4(3): 227-234 (2007).
[10] Vinay Kanwar, Sukhjit Singh, J. R. Sharma and Mamta: New numerical techniques for solving non-linear equations. Indian J. Pure Appl. Math. 34(9): 1339-1349 (2003).
[11] J. H. Yun: A note on three-step iterative method for nonlinear equations. Appl. Math. Comput., doi:10.1016/j.amc.2008.02.002 (2008).

Received: July, 2010
