One of the most challenging and fascinating problems of the theory of neural nets is that of asymptotic behavior, of how a system behaves as time proceeds. This is of particular relevance to many practical applications. Here we focus on association, generalization, and representation. We turn to the last topic first. The introductory chapter, "Global Analysis of Recurrent Neural Networks," by Andreas Herz presents an in-depth analysis of how to construct a Lyapunov function for various types of dynamics and neural coding. It includes a review of the recent work with John Hopfield on integrate-and-fire neurons with local interactions. The chapter "Receptive Fields and Maps in the Visual Cortex: Models of Ocular Dominance and Orientation Columns," by Ken Miller, explains how the primary visual cortex may asymptotically gain its specific structure through a self-organization process based on Hebbian learning. His argument has since been shown to be rather susceptible to generalization.
This collection of articles by leading researchers in neural networks responds to the urgent need for timely and comprehensive reviews in a multidisciplinary, rapidly developing field of research. It continues the themes of the previous volume, but shifts its focus to more practical matters, such as data storage and retrieval, and the recognition of handwriting.
Free shipping within Germany
Shipping destinations, costs & delivery time: EUR 5.85 for shipping from the United Kingdom to Germany
Shipping destinations, costs & delivery time. Seller: Buchpark, Trebbin, Germany
Condition: Good | Pages: 311 | Language: English | Product type: Books. Item no. 3325770/3
Quantity: 3 available
Seller: Emile Kerssemakers ILAB, Heerlen, Netherlands
Original hardcover. xii, (2), 312 pp.; 24x16 cm. "Physics of Neural Networks". Text in English. Very good, see picture. 640 g. Item no. 73693
Quantity: 1 available
Seller: Ria Christie Collections, Uxbridge, United Kingdom
Condition: New. Item no. ria9780387943688_new
Quantity: More than 20 available
Seller: AHA-BUCH GmbH, Einbeck, Germany
Hardcover. Condition: Very good. Used - very good: minor damage or soiling, unread remainder copy, stamped. Item no. INF1000503840
Quantity: 1 available
Seller: AHA-BUCH GmbH, Einbeck, Germany
Book. Condition: New. New stock. Item no. 9780387943688
Quantity: 2 available