On Iterated-Subspace Minimization Methods for Nonlinear Optimization

by A. R. Conn¹, Nick Gould², A. Sartenaer³ and Ph. L. Toint³

Report 94/13

May 18, 1994

¹ IBM T.J. Watson Research Center, P.O. Box 218, Yorktown Heights, NY 10598, USA. Email: [email protected]

² Rutherford Appleton Laboratory, Chilton, Oxfordshire, England. Email: [email protected]. Current reports available by anonymous ftp from the directory "pub/reports" on camelot.cc.rl.ac.uk (internet 130.246.8.61)

³ Department of Mathematics, Facultes Universitaires ND de la Paix, 61, rue de Bruxelles, B-5000 Namur, Belgium, EC. Email: [email protected] and [email protected]. Current reports available by anonymous ftp from the directory "pub/reports" on thales.math.fundp.ac.be (internet 138.48.4.14)

Keywords: Unconstrained optimization, large-scale computation, convergence theory.
Mathematics Subject Classifications: 65K05, 90C30
Abstract

We consider a class of Iterated-Subspace Minimization (ISM) methods for solving large-scale unconstrained minimization problems. At each major iteration of such a method, a low-dimensional manifold, the iterated subspace, is constructed and an approximate minimizer of the objective function in this manifold is then determined. The iterated subspace is chosen to contain vectors which ensure global convergence of the overall scheme and may also contain vectors which encourage fast asymptotic convergence. We demonstrate the efficacy of this approach on a collection of large problems and indicate a number of avenues of future research.
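To make the outer/inner structure of an ISM iteration concrete, the following is a minimal sketch, not the authors' algorithm: here the iterated subspace is (as one hypothetical choice) spanned by the negative gradient, which plays the globalizing role, and the previous step, which may encourage faster convergence; the paper leaves the subspace construction and the inner minimization far more general.

```python
# Hypothetical ISM sketch. Assumptions (not from the paper): the subspace is
# spanned by the negative gradient and the previous step, and the inner
# subspace minimization is delegated to a generic unconstrained solver.
import numpy as np
from scipy.optimize import minimize


def ism(f, grad, x0, outer_iters=50, tol=1e-8):
    """Iterated-Subspace Minimization sketch: at each major iteration,
    approximately minimize f restricted to the low-dimensional manifold
    x + span(S), where S collects the chosen subspace vectors."""
    x = np.asarray(x0, dtype=float).copy()
    prev_step = None
    for _ in range(outer_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Vectors spanning the iterated subspace: the negative gradient
        # (ensures descent/global convergence) and, when available, the
        # previous step (an accelerating vector).
        cols = [-g]
        if prev_step is not None and np.linalg.norm(prev_step) > 0:
            cols.append(prev_step)
        S, _ = np.linalg.qr(np.array(cols).T)  # orthonormal n-by-k basis
        # Inner iteration: approximately minimize the k-dimensional
        # restriction z -> f(x + S z).
        res = minimize(lambda z: f(x + S @ z), np.zeros(S.shape[1]))
        step = S @ res.x
        x = x + step
        prev_step = step
    return x
```

On a strictly convex quadratic f(x) = ½xᵀAx − bᵀx the inner problem is solved essentially exactly, so the sketch behaves like steepest descent on the first outer iteration and like a two-dimensional subspace method thereafter.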
1 Introduction

In this paper, we consider finding a local solution of the unconstrained minimization problem

    minimize f(x),  x ∈ ℝⁿ,