Some Numerical Results Using a Sparse Matrix Updating Formula in Unconstrained Optimization

Type: Article

Publication Date: 1978-07-01

Citations: 9

DOI: https://doi.org/10.2307/2006489

Abstract

This paper presents a numerical comparison between algorithms for unconstrained optimization that take account of sparsity in the second derivative matrix of the objective function. Some of the methods included in the comparison use difference approximation schemes to evaluate the second derivative matrix, and others use an approximation to it which is updated regularly using the changes in the gradient. These results show which method to use in which circumstances and also suggest interesting future developments.

1. Introduction

Numerical procedures for finding the minimum of a general function of several variables have been studied extensively in the last few years, and many powerful algorithms have been proposed that require the calculation of the vector of first derivatives (see Davidon [2], Fletcher and Powell [4], Huang [6], Powell [8], for example). One common feature of these procedures is that they use a matrix which can be regarded as an approximation to the second derivative matrix of the objective function. This matrix is updated regularly, so that the updated matrix satisfies a linear equation usually known as the "quasi-Newton equation" or "DFP condition". Because all the classical updating formulae revise all the elements of the matrix, the size of the problem that can be treated is often limited by the amount of computer storage available. However, a way to take any present sparsity into account has been suggested by Powell [11]: it does not change the elements of the approximating matrix corresponding to known zero entries in the Hessian of the objective function. Toint [14] shows that the calculation can be arranged so that its main part is only the solution of a sparse, symmetric and positive definite system of linear equations. The main purpose of this paper is to present numerical results that compare this procedure with some other methods that also use first derivative information, in order to show the practical and computational value of the proposed method. The other methods included in the comparison are Powell's method [8], since the new formula may be viewed as an extension of it to the sparse case, and methods using difference approximations to the Hessian. When sparsity conditions
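The sparse updating idea outlined above (Powell [11], Toint [14]) can be illustrated with a small PSB-style update confined to a known sparsity pattern: the smallest symmetric correction to B, zero outside the pattern, such that the updated matrix satisfies the quasi-Newton equation B_new s = y. The sketch below is illustrative only, not the paper's implementation: the function and variable names are mine, and the multiplier system Q lam = y - B s is solved densely for simplicity, whereas in practice one would exploit its sparse, symmetric positive definite structure.

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting (small, dense)."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for j in range(c, n + 1):
                M[r][j] -= f * M[c][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x


def sparse_psb_update(B, s, y, pattern):
    """Sparsity-respecting PSB-type update (sketch after Toint [14]).

    B: symmetric matrix (list of lists) approximating the Hessian;
    pattern: symmetric set of (i, j) index pairs, including the diagonal,
    where the Hessian may be nonzero. Returns B_new with B_new @ s = y,
    B_new symmetric, and B_new zero outside the pattern wherever B was.
    """
    n = len(s)
    # Projection of s onto the sparsity pattern of each row i.
    shat = [[s[j] if (i, j) in pattern else 0.0 for j in range(n)]
            for i in range(n)]
    # Residual of the quasi-Newton equation: r = y - B s.
    r = [y[i] - sum(B[i][j] * s[j] for j in range(n)) for i in range(n)]
    # Multiplier system Q lam = r; Q is symmetric and, in the sparse
    # setting, the sparse positive definite system mentioned in the text.
    Q = [[0.0] * n for _ in range(n)]
    for i in range(n):
        Q[i][i] = sum(shat[i][j] * s[j] for j in range(n))
        for k in range(n):
            if (i, k) in pattern:
                Q[i][k] += s[i] * s[k]
    lam = solve(Q, r)
    # Correction B_new = B + sum_k lam[k] * (e_k shat_k^T + shat_k e_k^T),
    # which by construction vanishes outside the pattern.
    Bn = [row[:] for row in B]
    for k in range(n):
        for j in range(n):
            Bn[k][j] += lam[k] * shat[k][j]
            Bn[j][k] += lam[k] * shat[k][j]
    return Bn
```

For a tridiagonal pattern, for instance, only the three diagonals of B are touched, and the updated matrix still reproduces the observed gradient change along the step s.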

Locations

  • Mathematics of Computation

Similar Works

  • Some numerical results using a sparse matrix updating formula in unconstrained optimization (1978). Philippe L. Toint
  • On a new method for sparse matrix updating in unconstrained minimization (1981). Ángelo Lucia, Leroy F. Stutzman
  • A note on quasi-Newton formulae for sparse second derivative matrices (1981). M. J. D. Powell
  • On Sparse and Symmetric Matrix Updating Subject to a Linear Equation (1977). Philippe L. Toint
  • The Shanno-Toint Procedure for Updating Sparse Symmetric Matrices (1981). M. J. D. Powell, Philippe L. Toint
  • A diagonal quasi-Newton updating method for unconstrained optimization (2018). Neculai Andrei
  • On the Modifications of a Broyden's Single Parameter Rank-Two Quasi-Newton Method for Unconstrained Minimization (1999). Wah June Leong
  • A new class of quasi-Newton updating formulas (2008). Donghui Li, Liqun Qi, Vera Roshchina
  • New Class of Rank 1 Update for Solving Unconstrained Optimization Problem (2022). Saad Shakir Mahmood, Jaafer Hmood Eidi
  • An Overview of Some Practical Quasi-Newton Methods for Unconstrained Optimization (2007). M. Al-Baali, Humaid Khalfan
  • A Diagonal-Sparse Quasi-Newton Method for Unconstrained Optimization Problem (2006). Shi Zhen-jun, Sun Guo
  • A compact updating formula for quasi-Newton minimization algorithms (1982). Lucio Grandinetti
  • Modification of PSB Quasi-Newton Update and its Global Convergence for Solving Systems of Nonlinear Equations (2021). Muhammad Kabir Dauda
  • A comparison of Quasi-Newton methods considering line search conditions in unconstrained minimization (2022). Kadir Kiran
  • A Comparison of Newton’s Method and Two of its Modifications for the Solution of Nonlinear Optimization Problems (2017). Taiwo Lukumon Abiodun, Olusesan Adeyemi Adelabu
  • On the Estimation of Sparse Hessian Matrices (1979). M. J. D. Powell, Philippe L. Toint
  • A new class of BFGS updating formula based on the new quasi-Newton equation (2019). Basim A. Hassan, Hussein K. Khalo
  • Unconstrained optimization methods based on direct updating of Hessian factors (2003). Saad Shakir Mahmood
  • A family of quasi-Newton methods for unconstrained optimization problems (2018). M.S. Salim, A. I. Ahmed
  • Derivative free Davidon-Fletcher-Powell (DFP) for solving symmetric systems of nonlinear equations (2018). Mustafa Mamat, Mohammed Dauda, M. A. Mohamed, Mohammed Yusuf Waziri, Fatma Susilawati Mohamad, Hishyar Kh. Abdullah