The variable metric algorithm is widely recognised as one of the most efficient ways of solving the following problem:

Locate x*, a local minimum point of f(x), x ∈ Rⁿ.  (1)

Considerable attention has been given to the study of the convergence properties of this algorithm, especially for the case where analytic expressions are available for the derivatives

g_i = ∂f/∂x_i,  i = 1, ..., n.  (2)

In particular we shall mention the results of Wolfe (1969) and Powell (1972, 1975). Wolfe established general conditions under which a descent algorithm will converge to a stationary point, and Powell showed that two particular, very efficient algorithms that cannot be shown to satisfy Wolfe's conditions do in fact converge to the minimum of convex functions under certain conditions. These results will be stated more completely in Section 2. In most practical problems analytic expressions for the gradient vector g (Eq. 2) are not available, and numerical derivatives are subject to truncation error. In Section 3 we shall consider the effects of these errors on Wolfe's convergence properties and will discuss possible modifications of the algorithms to make them reliable in these circumstances. The effects of rounding error are considered in Section 4, whilst in Section 5 these thoughts are extended to include the case of on-line function minimisation where each function evaluation is subject to random noise.
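When analytic derivatives are unavailable, each component of g in (2) has to be approximated numerically, and it is this approximation that introduces the truncation error discussed in Section 3. Purely as an illustrative sketch, and not the specific algorithms treated by Wolfe or Powell, the following Python fragment shows a BFGS-type variable metric iteration driven by forward-difference gradients; the helper names, the difference step h, the crude backtracking line search and the Rosenbrock test function are all assumptions introduced here for demonstration.

```python
# Illustrative sketch only: a BFGS-style variable metric iteration using
# forward-difference gradients, so each gradient component carries a
# truncation error of order h.  Names and the test function are assumptions.
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference estimate of g_i = df/dx_i; truncation error O(h)."""
    g = np.empty_like(x)
    fx = f(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def variable_metric(f, x0, iters=50, h=1e-6):
    """Quasi-Newton (inverse-Hessian) iteration with numerical derivatives."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(x.size)                  # approximation to the inverse Hessian
    g = fd_gradient(f, x, h)
    for _ in range(iters):
        d = -H @ g                      # search direction defined by the metric H
        t = 1.0                         # crude backtracking line search along d
        while f(x + t * d) > f(x) and t > 1e-12:
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = fd_gradient(f, x_new, h)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:                  # skip the update if curvature info is unreliable
            rho = 1.0 / sy
            I = np.eye(x.size)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rosenbrock = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    print(variable_metric(rosenbrock, [-1.2, 1.0], iters=200))
```

With exact line searches and analytic gradients this kind of update enjoys the convergence properties referred to above; replacing g by a finite-difference estimate is precisely what makes the truncation and rounding errors of Sections 3 and 4 relevant.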
Seller: NEPO UG, Rüsselsheim am Main, Germany
Paperback. Condition: Good. Nice ex-library copy. Language: English. Weight in grams: 550. Edition: softcover reprint of the original 1st ed., 1976. Item no. 338640
Quantity: 1 available
Seller: AHA-BUCH GmbH, Einbeck, Germany
Paperback. Condition: New. Print on demand; printed after ordering. Item no. 9783540076162
Quantity: 1 available
Seller: Revaluation Books, Exeter, United Kingdom
Paperback. Condition: Brand new. rep blg edition. 328 pages. German language. 9.60 x 6.69 x 0.74 inches. In stock. Item no. x-3540076166
Quantity: 2 available
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. In. Item no. ria9783540076162_new
Quantity: More than 20 available