Memory and the Computational Brain offers a provocative argument that goes to the heart of neuroscience, proposing that the field can and should benefit from the advances in cognitive science and the development of information theory over the last several decades.
C. R. Gallistel is Co-Director of the Rutgers Center for Cognitive Science. He is one of the foremost psychologists working on the foundations of cognitive neuroscience. His publications include The Symbolic Foundations of Conditioned Behavior (2002) and The Organization of Learning (1990).
Adam Philip King is Assistant Professor of Mathematics at Fairfield University.
Memory and the Computational Brain spans the fields of cognitive science, linguistics, psychology, neuroscience, and education, to suggest new perspectives on the way we consider learning mechanisms in the brain.
Gallistel and King propose that the architecture of the brain is structured precisely for learning and for memory, and that the concept of an addressable read/write memory mechanism should be integrated into the foundations of neuroscience. They argue that the field of neuroscience can and should benefit from the advances of cognitive science and the development of information theory over recent decades. Based on three lectures given by Randy Gallistel in the prestigious Blackwell/Maryland Lectures in Language and Cognition, the text has been significantly revised and expanded with numerous interdisciplinary examples and models, and it reflects recent research, making it essential reading for both students and those working in the field.
Most cognitive scientists think about the brain and behavior within an information-processing framework: Stimuli acting on sensory receptors provide information about the state of the world. The sensory receptors transduce the stimuli into neural signals, streams of action potentials (aka spikes). The spike trains transmit the information contained in the stimuli from the receptors to the brain, which processes the sensory signals in order to extract from them the information that they convey. The extracted information may be used immediately to inform ongoing behavior, or it may be kept in memory to be used in shaping behavior at some later time. Cognitive scientists seek to understand the stages of processing by which information is extracted, the representations that result, the motor planning processes through which the information enters into the direction of behavior, the memory processes that organize and preserve the information, and the retrieval processes that find the information in memory when it is needed. Cognitive neuroscientists want to understand where these different aspects of information processing occur in the brain and the neurobiological mechanisms by which they are physically implemented.
Historically, the information-processing framework in cognitive science is closely linked to the development of information technology, which is used in electronic computers and computer software to convert, store, protect, process, transmit, and retrieve information. But what exactly is this "information" that is so central to both cognitive science and computer science? Does it have a rigorous meaning? In fact, it does. Moreover, the conceptual system that has grown up around this rigorous meaning - information theory - is central to many aspects of modern science and engineering, including some aspects of cognitive neuroscience. For example, it is central to our emerging understanding of how neural signals transmit information about the ever-changing state of the world from sensory receptors to the brain (Rieke, Warland, de Ruyter van Steveninck, & Bialek, 1997). For us, it is an essential foundation for our central claim, which is that the function of the neurobiological memory mechanism is to carry information forward in time in a computationally accessible form.
Shannon's Theory of Communication
The modern quantitative understanding of information rests on the work of Claude Shannon. A telecommunications engineer at Bell Laboratories, he laid the mathematical foundations of information theory in a famous paper published in 1948, at the dawn of the computer age (Shannon, 1948). Shannon's concern was understanding communication (the transmission of information), which he schematized as illustrated in Figure 1.1.
The schematic begins with an information source. The source might be a person who hands in a written message at a telegraph office. Or, it might be an orchestra playing a Beethoven symphony. In order for the message to be communicated to you, you must receive a signal that allows you to reconstitute the message. In this example, you are the destination of the message. Shannon's analysis ends when the destination has received the signal and reconstituted the message that was present at the source.
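The path from source to destination can be sketched, under our own simplifying assumptions, as a toy pipeline. The character-to-integer code and the noise-free channel below are illustrative stand-ins, not anything specified in the text:

```python
# Toy end-to-end run of Shannon's schematic: a message is encoded into
# a signal, carried (here noiselessly) over a channel, and decoded at
# the destination. The code maps characters to integers -- a stand-in
# for any invertible encoding.

def encoder(message):            # transmitter: message -> signal
    return [ord(ch) for ch in message]

def channel(signal):             # ideal, noise-free channel
    return list(signal)

def decoder(signal):             # receiver: signal -> message
    return ''.join(chr(x) for x in signal)

received = decoder(channel(encoder('Arriving tomorrow, 10 am')))
assert received == 'Arriving tomorrow, 10 am'
```

Because the code is invertible and the channel is noise-free, the destination reconstitutes the source message exactly; Shannon's theory is concerned with what can still be achieved when the channel is noisy.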
The transmitter is the system that converts the messages into transmitted signals, that is, into fluctuations of a physical quantity that travels from a source location to a receiving location and that can be detected at the receiving location. Encoding is the process by which the messages are converted into transmitted signals. The rules governing or specifying this conversion are the code. The mechanism in the transmitter that implements the conversion is the encoder.
Following Shannon, we will continue to use two illustrative examples, a telegraphic communication and a symphonic broadcast. In the telegraphic example, the source messages are written English phrases handed to the telegrapher, for example, "Arriving tomorrow, 10 am." In the symphonic example, the source messages are sound waves arriving at a microphone. Any one particular short message written in English and handed to a telegraph operator can be thought of as coming from a finite set of possible messages. If we stipulate a maximum length of, say, 1,000 characters, with each character being one of 45 or so different characters (26 letters, 10 digits, and punctuation marks), then there is a very large but finite number of possible messages. Moreover, only a very small fraction of these messages are intelligible English, so the size of the set of possible messages - defined as intelligible English messages of 1,000 characters or less - is further reduced. It is less clear that the sound waves generated by an orchestra playing Beethoven's Fifth can be conceived of as coming from a finite set of messages. That is why Shannon chose this as his second example. It serves to illustrate the generality of his theory.
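To make the finiteness concrete (the arithmetic below is ours, not the authors'), one can count the strings directly; Python's arbitrary-precision integers handle the astronomical result:

```python
# Count every possible telegram of up to 1,000 characters drawn from an
# alphabet of roughly 45 symbols (26 letters, 10 digits, punctuation).
# The number of strings of length exactly k is 45**k; summing over all
# lengths 0..1000 counts the whole (finite) message set.

ALPHABET_SIZE = 45
MAX_LENGTH = 1000

total_messages = sum(ALPHABET_SIZE ** k for k in range(MAX_LENGTH + 1))

# Astronomically large, yet finite: roughly 10**1653.
print(len(str(total_messages)))  # prints 1654 (decimal digits in the count)
```

The set of intelligible English messages is a tiny subset of this already finite set, which is the point of the example.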
In the telegraphy example, the telegraph system is the transmitter of the messages. The signals are the short current pulses in the telegraph wire, which travel from the sending key to the sounder at the receiving end. The encoder is the telegraph operator. The code generally used is the Morse code. This code uses pulses of two different durations to encode the characters - a short mark (dot) and a long mark (dash). It also uses four different inter-pulse intervals for separations - an intra-character gap (between the dots and dashes within characters), a short gap (between the letters), a medium gap (between words), and a long gap (between sentences).
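A minimal sketch of such an encoder follows; the code table is a small excerpt of International Morse, and the gap conventions are simplified here to a single space between letters and ' / ' between words:

```python
# Toy Morse encoder for the pulse/gap scheme described above.
# Only the characters needed for the running example are included.

MORSE = {
    'A': '.-', 'R': '.-.', 'I': '..', 'V': '...-',
    'N': '-.', 'G': '--.', 'T': '-', 'O': '---',
    'M': '--', 'W': '.--', '1': '.----', '0': '-----',
}

def encode(text):
    """Encode text as Morse: a short gap (space) between letters,
    a medium gap (' / ') between words."""
    words = text.upper().split()
    return ' / '.join(' '.join(MORSE[ch] for ch in word if ch in MORSE)
                      for word in words)

print(encode('ARRIVING TOMORROW'))
# .- .-. .-. .. ...- .. -. --. / - --- -- --- .-. .-. --- .--
```

The two mark durations and the graded gap durations together make the signal decodable without any explicit character boundaries in the wire.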
In the orchestral example, the broadcast system transmitting radio signals from the microphone to your radio is the transmitter. The encoder is the electronic device that converts the sound waves into electromagnetic signals. The code is likely to be one of three codes that have been used in the history of radio (see Figure 1.2), all of which remain in current use. All of them vary a parameter of a high-frequency sinusoidal carrier signal. The earliest code was the AM (amplitude modulated) code. In this code, the encoder modulates the amplitude of the carrier signal so that it varies in time in a way that closely follows the variation in time of the sound pressure at the microphone's membrane.
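In symbols, an AM signal can be written s(t) = (1 + d·m(t))·sin(2πf_c·t), where m(t) is the message waveform and d the modulation depth. A small sketch follows; the carrier frequency, tone, and depth are all our illustrative choices:

```python
import math

CARRIER_HZ = 100_000.0   # illustrative carrier frequency
MESSAGE_HZ = 440.0       # illustrative audio tone (concert A)
DEPTH = 0.5              # modulation depth, kept below 1 to avoid overmodulation

def message(t):
    """Sound-pressure waveform arriving at the microphone (a pure tone here)."""
    return math.sin(2 * math.pi * MESSAGE_HZ * t)

def am_signal(t):
    """Carrier whose amplitude envelope (1 + DEPTH * m(t)) tracks the message."""
    return (1 + DEPTH * message(t)) * math.sin(2 * math.pi * CARRIER_HZ * t)

# The transmitted signal stays within the envelope bound 1 + DEPTH.
samples = [am_signal(k * 1e-7) for k in range(10_000)]
assert max(abs(s) for s in samples) <= 1 + DEPTH + 1e-9
```

A receiver recovers the message by tracking the envelope of the received signal, which is why AM demodulation can be done with very simple circuitry.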
When the FM (frequency modulated) code is used, the encoder modulates the frequency of the carrier signal within a limited range. When the digital code is used, as it is in satellite radio, parameters of the carrier frequency are modulated so as to implement a binary code, a code in which there are only two characters, customarily called the '0' and the '1' character. In this system, time is divided into extremely short intervals. During any one interval, the carrier signal is either low ('0') or high ('1'). The relation between the sound wave arriving at the microphone with its associated encoding electronics and the transmitted binary signal is not easily described, because the encoding system is a sophisticated one that makes use of what we have learned about the statistics of broadcast messages to create efficient codes. The development of these codes rests on the...
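The slot scheme itself is simple, even though the real codecs layered on top of it are not. A toy version follows; the level values and the threshold are our own illustrative choices:

```python
LOW, HIGH = 0.0, 1.0      # illustrative carrier levels for '0' and '1'

def bits_to_levels(bits):
    """Transmitter side: hold the carrier low or high for each time slot."""
    return [HIGH if b == '1' else LOW for b in bits]

def levels_to_bits(levels, threshold=0.5):
    """Receiver side: recover each bit by thresholding its slot."""
    return ''.join('1' if v > threshold else '0' for v in levels)

# Round trip: the destination reconstitutes the original bit string.
assert levels_to_bits(bits_to_levels('101100')) == '101100'
```

Thresholding is what gives digital transmission its robustness: small perturbations of the slot levels leave the recovered bits unchanged.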