Fully describes the optimization methods that are currently most valuable in solving real-life problems. Since optimization has applications in almost every branch of science and technology, the text emphasizes the practical aspects of the methods, together with the heuristics that make them perform more reliably and efficiently. To this end, it presents comparative numerical studies to give readers a feel for possible applications and to illustrate the problems in assessing evidence. It also provides the theoretical background that gives insight into how the methods are derived. This edition offers revised coverage of basic theory and standard techniques, with updated discussions of line search methods, Newton and quasi-Newton methods, and conjugate direction methods, as well as a comprehensive treatment of restricted step or trust region methods not commonly found in the literature. It also includes recent developments in hybrid methods for nonlinear least squares; an extended discussion of linear programming, with new methods for stable updating of LU factors; and a completely new section on network programming. Chapters include computer subroutines, worked examples, and study questions.
This established textbook is noted for its coverage of optimization methods that are of practical importance. It provides a thorough treatment of standard methods such as linear and quadratic programming, Newton-like methods and the conjugate gradient method. The theoretical aspects of the subject include an extended treatment of optimality conditions and the significance of Lagrange multipliers. The relevance of convexity theory to optimization is also not neglected. A significant proportion of the book is devoted to the solution of nonlinear problems, with an authoritative treatment of current methodology. Thus state-of-the-art techniques such as the BFGS method, trust region methods and the SQP method are described and analysed. Other features are an extensive treatment of nonsmooth optimization and the L_1 penalty function.

Contents

Part 1: Unconstrained Optimization

* Structure of Methods
* Newton-like Methods
* Conjugate Direction Methods
* Restricted Step Methods
* Sums of Squares and Nonlinear Equations

Part 2: Constrained Optimization

* Linear Programming
* The Theory of Constrained Optimization
* Quadratic Programming
* General Linearly Constrained Optimization
* Nonlinear Programming
* Other Optimization Problems
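To give a concrete flavour of the quasi-Newton methods mentioned above, here is a minimal sketch (not taken from the book) of the BFGS inverse-Hessian update combined with a simple Armijo backtracking line search, applied to the Rosenbrock test function. The function names and parameter values are illustrative choices, not the book's code.

```python
import numpy as np

def rosenbrock(x):
    # Classic two-dimensional test function with minimizer at (1, 1)
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def rosenbrock_grad(x):
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
        200 * (x[1] - x[0]**2),
    ])

def bfgs(f, grad, x0, tol=1e-8, max_iter=500):
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)              # approximation to the inverse Hessian
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g             # quasi-Newton search direction
        # Backtracking line search enforcing the Armijo condition
        alpha, fx = 1.0, f(x)
        while f(x + alpha * p) > fx + 1e-4 * alpha * (g @ p):
            alpha *= 0.5
        s = alpha * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:         # curvature condition; skip update otherwise
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

x_star = bfgs(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
print(x_star)  # should approach the known minimizer at (1, 1)
```

A production implementation would use a line search satisfying the Wolfe conditions (which guarantees the curvature condition sy > 0) rather than plain Armijo backtracking; the skip-update safeguard above is a common simple workaround.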
About the author

Professor Roger Fletcher completed his MA at the University of Cambridge in 1960 and his PhD at the University of Leeds in 1963. He was a lecturer at the University of Leeds from 1963 to 1969, then Principal Scientific Officer at AERE Harwell until 1973. He then joined the University of Dundee, where he is Professor of Optimization and holds the Baxter Chair of Mathematics. In 1997 he was awarded the prestigious Dantzig Prize for fundamental contributions to algorithms for nonlinear optimization, awarded jointly by the Society for Industrial and Applied Mathematics and the Mathematical Programming Society. He is a Fellow of the Royal Society of Edinburgh and of the Institute of Mathematics and its Applications.