Neural Networks for Conditional Probability Estimation

Dirk Husmeier

ISBN 10: 1852330953 ISBN 13: 9781852330958
Publisher: Springer London, 1999
New softcover

Seller: moluna, Greven, Germany

AbeBooks seller since 9 July 2020


Description:

This is a print-on-demand item and will be printed for you after your order is placed. Provides unique, comprehensive coverage of generalisation and regularisation; provides the first real-world test results for recent theoretical findings on the generalisation performance of committees. Seller inventory no. 4289360


Synopsis:

Conventional applications of neural networks usually predict a single value as a function of given inputs. In forecasting, for example, a standard objective is to predict the future value of some entity of interest on the basis of a time series of past measurements or observations. Typical training schemes aim to minimise the sum of squared deviations between predicted and actual values (the 'targets'), by which, ideally, the network learns the conditional mean of the target given the input. If the underlying conditional distribution is Gaussian or at least unimodal, this may be a satisfactory approach. However, for a multimodal distribution, the conditional mean does not capture the relevant features of the system, and the prediction performance will, in general, be very poor. This calls for a more powerful and sophisticated model, which can learn the whole conditional probability distribution. Chapter 1 demonstrates that even for a deterministic system and 'benign' Gaussian observational noise, the conditional distribution of a future observation, conditional on a set of past observations, can become strongly skewed and multimodal. In Chapter 2, a general neural network structure for modelling conditional probability densities is derived, and it is shown that a universal approximator for this extended task requires at least two hidden layers. A training scheme is developed from a maximum likelihood approach in Chapter 3, and the performance of this method is demonstrated on three stochastic time series in Chapters 4 and 5.
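A brief worked illustration of the point made in the synopsis (this sketch is not taken from the book; the notation and the Gaussian-mixture form are generic assumptions): for a network output $f(x)$ trained on targets $t$, the expected squared error decomposes as

\[
\mathbb{E}\big[(t - f(x))^2\big]
  \;=\; \mathbb{E}_x\big[(f(x) - \mathbb{E}[t \mid x])^2\big]
  \;+\; \mathbb{E}_x\big[\operatorname{Var}(t \mid x)\big],
\]

so its minimiser is the conditional mean, $f^*(x) = \mathbb{E}[t \mid x]$. A network that models the full conditional density instead outputs, for example, the parameters of a mixture,

\[
p(t \mid x) \;=\; \sum_{k=1}^{K} \alpha_k(x)\,
  \mathcal{N}\!\big(t;\ \mu_k(x),\ \sigma_k^2(x)\big),
  \qquad \alpha_k(x) \ge 0, \quad \sum_{k=1}^{K} \alpha_k(x) = 1,
\]

and is trained by maximising the log-likelihood $\sum_n \log p(t_n \mid x_n)$, which remains meaningful when $p(t \mid x)$ is skewed or multimodal, whereas the conditional mean alone does not.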

"About this title" may refer to a different edition of this title.

Bibliographic details

Title: Neural Networks for Conditional Probability Estimation
Publisher: Springer London
Publication date: 1999
Binding: Softcover
Condition: New

Best search results on ZVAB


Husmeier, Dirk:
ISBN 10: 1852330953 ISBN 13: 9781852330958
Used softcover

Seller: Roland Antiquariat UG (haftungsbeschränkt), Weinheim, Germany


Softcover. XXIII, 275 pp.: graphs; 24 cm. Like new, unread book. ISBN 9781852330958. Language: German. Weight: 467 g. Softcover reprint of the original 1st ed. 1999. Item no. 200027

Buy used: EUR 56.00
Shipping: EUR 2.50 (within Germany)
Quantity: 1 available


Dirk Husmeier
Publisher: Springer London, 1999
ISBN 10: 1852330953 ISBN 13: 9781852330958
New paperback

Seller: AHA-BUCH GmbH, Einbeck, Germany


Paperback. Condition: New. Print-on-demand new stock, printed after ordering. Description as in the synopsis above. Item no. 9781852330958

Buy new: EUR 59.97
Shipping: free (within Germany)
Quantity: 1 available


Husmeier, Dirk
Publisher: Springer, 1999
ISBN 10: 1852330953 ISBN 13: 9781852330958
New softcover

Seller: Ria Christie Collections, Uxbridge, United Kingdom


Condition: New. Item no. ria9781852330958_new

Buy new: EUR 62.15
Shipping: EUR 5.91 (from the United Kingdom to Germany)
Quantity: more than 20 available


Dirk Husmeier
Publisher: Springer, 1999
ISBN 10: 1852330953 ISBN 13: 9781852330958
New paperback

Seller: Revaluation Books, Exeter, United Kingdom


Paperback. Condition: Brand new. 275 pages. 9.50 x 6.25 x 0.75 inches. In stock. Item no. x-1852330953

Buy new: EUR 80.31
Shipping: EUR 11.87 (from the United Kingdom to Germany)
Quantity: 2 available