
Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency (Tutorial Introductions) - Hardcover

 
9780993367960: Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency (Tutorial Introductions)

Synopsis

Pages: 214. Genres: MJN: Neurology & clinical neurophysiology; UYZM: Information architecture; PSAN: Neurosciences. Synopsis: The brain is the most complex computational machine known to science, even though its components (neurons) are slow and unreliable compared to a laptop computer. In this richly illustrated book, Shannon's mathematical theory of information is used to explore the metabolic efficiency of neurons, with special reference to visual perception. Evidence from a diverse range of research papers is used to show how information theory defines absolute limits on neural efficiency, limits which ultimately determine the neuroanatomical microstructure of the eye and brain. Written in an informal style, with a comprehensive glossary, tutorial appendices, explainer boxes, and a list of annotated Further Readings, this book is an ideal introduction to cutting-edge research in neural information theory.

The synopsis may refer to a different edition of this title.

Reviews

This is a terrific book, which cannot fail to help any student who wants to understand precisely how energy and information constrain neural design. The tutorial approach adopted makes it more like a novel than a textbook. Consequently, both mathematically sophisticated readers and readers who prefer verbal explanations should be able to understand the material. Overall, Stone has managed to weave the disparate strands of neuroscience, psychophysics, and Shannon’s theory of communication into a coherent account of neural information theory. I only wish I'd had this text as a student!

Peter Sterling, Professor of Neuroscience, University of Pennsylvania, USA.

 

"Essential reading for any student of the {\em why} of neural coding: why do neurons send signals they way they do? Stone's insightful, clear, and eminently readable synthesis of classic studies is a gateway to a rich, glorious literature on the brain. Student and professor alike will find much to spark their minds within. I shall be keeping this wonderful book close by, as a sterling reminder to ask not just how brains work, but why."

Professor Mark Humphries, School of Psychology, University of Nottingham, UK.
 

"This excellent book provides an accessible introduction to an information theoretic perspective on how the brain works, and (more importantly) why it works that way. Using a wide range of examples, including both structural and functional aspects of brain organisation, Stone describes how simple optimisation principles derived from Shannon's information theory predict physiological parameters (e.g. axon diameter) with remarkable accuracy. These principles are distilled from original research papers, and the informal presentation style means that the book can be appreciated as an overview; but full mathematical details are also provided for dedicated readers. Stone has integrated results from a diverse range of experiments, and in so doing has produced an invaluable introduction to the nascent field of neural information theory. "

Dr Robin Ince, Centre for Cognitive Neuroimaging, Institute of Neuroscience and Psychology, University of Glasgow, UK.

 


  • Publisher: Tutorial Introductions
  • Publication date: 2018
  • ISBN 10: 0993367968
  • ISBN 13: 9780993367960
  • Binding: Hardcover
  • Language: English
  • Number of pages: 214

EUR 14.23 shipping from United Kingdom to USA


Other popular editions of the same title

9780993367922: Principles of Neural Information Theory: Computational Neuroscience and Metabolic Efficiency (Tutorial Introductions)

Featured edition

ISBN 10: 0993367925 ISBN 13: 9780993367922
Publisher: Sebtel Press, 2018
Softcover

Search results for Principles of Neural Information Theory: Computational...


Stone, James V
Publisher: Tutorial Introductions, 2018
ISBN 10: 0993367968 ISBN 13: 9780993367960
New Hardcover

Seller: Ria Christie Collections, Uxbridge, United Kingdom

Seller rating: 5 out of 5 stars

Condition: New. Item no. ria9780993367960_new


Buy new

EUR 125.78
Shipping: EUR 14.23
From United Kingdom to USA

Quantity: More than 20 available
