103,99 €
incl. VAT
Free shipping*
Ready to ship in 6-10 days
  • Hardcover

Product Description
Two approaches are known for solving large-scale unconstrained optimization problems: the limited-memory quasi-Newton method (truncated Newton method) and the conjugate gradient method. This is the first book to detail conjugate gradient methods, showing their properties and convergence characteristics as well as their performance in solving large-scale unconstrained optimization problems and applications. Comparisons to the limited-memory and truncated Newton methods are also discussed. Topics studied in detail include linear conjugate gradient methods, standard conjugate gradient methods, acceleration of conjugate gradient methods, hybrid methods, modifications of the standard scheme, memoryless BFGS preconditioned methods, and three-term methods. Conjugate gradient methods based on clustering the eigenvalues or on minimizing the condition number of the iteration matrix are also treated. For each method, the convergence analysis, the computational performance, and comparisons with other conjugate gradient methods are given.
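
For orientation, the following is a minimal sketch of the standard nonlinear conjugate gradient scheme mentioned above, using a Fletcher-Reeves beta and an Armijo backtracking line search. The test function, parameter values, and the restart safeguard are illustrative choices, not taken from the book.

```python
import numpy as np

def conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with a Fletcher-Reeves beta and Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start along steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                      # safeguard: restart if d is not a descent direction
            d = -g
        # Backtracking line search enforcing the Armijo sufficient-decrease condition.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves: ||g_{k+1}||^2 / ||g_k||^2
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative use on a small convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x_star = conjugate_gradient(lambda x: 0.5 * x @ A @ x - b @ x,
                            lambda x: A @ x - b,
                            np.zeros(2))
print(x_star)                               # close to the solution of A x = b
```

The many variants covered in the book (hybrid, three-term, memoryless BFGS preconditioned, and so on) differ chiefly in how the search direction and the beta parameter are constructed, while the overall iteration follows this pattern.
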
The theory behind the conjugate gradient algorithms presented as a methodology is developed with a clear, rigorous, and friendly exposition; readers will gain an understanding of their properties and convergence behavior and will learn to develop and prove the convergence of their own methods. Numerous numerical studies are supplied, with comparisons and comments on the behavior of conjugate gradient algorithms for solving a collection of 800 unconstrained optimization problems of different structures and complexities, with the number of variables in the range [1000, 10000]. The book is addressed to all those interested in developing and using new advanced techniques for solving complex unconstrained optimization problems. Mathematical programming researchers, theoreticians and practitioners in operations research, practitioners in engineering and industry, as well as graduate students in mathematics and Ph.D. and master's students in mathematical programming, will find plenty of information and practical applications for solving large-scale unconstrained optimization problems by conjugate gradient methods.

About the Author
Neculai Andrei holds a position at the Center for Advanced Modeling and Optimization at the Academy of Romanian Scientists in Bucharest, Romania. Dr. Andrei's areas of interest include mathematical modeling, linear programming, nonlinear optimization, high performance computing, and numerical methods in mathematical programming. In addition to the present volume, Neculai Andrei has published two books with Springer: Continuous Nonlinear Optimization for Engineering Applications in GAMS Technology (2017) and Nonlinear Optimization Applications Using the GAMS Technology (2013).
Reviews
"The book is well written for understanding several kinds of nonlinear CG methods and their
convergence properties. ... The book will be very useful for researchers, graduate students and practitioners interested in studying nonlinear CG methods." (Hiroshi Yabe, Mathematical Reviews, April, 2022)