Textbook
Memory and the Computational Brain: Why Cognitive Science Will Transform Neuroscience
ISBN: 978-1-4051-2288-7
Paperback
336 pages
April 2009, ©2009, Wiley-Blackwell
Other Available Formats: Hardcover
Preface.
1. Information.
Shannon’s Theory of Communication.
Measuring Information.
Efficient Coding.
Information and the Brain.
Digital and Analog Signals.
Appendix: The Information Content of Rare Versus Common Events and Signals.
2. Bayesian Updating.
Bayes’ Theorem and Our Intuitions About Evidence.
Using Bayes’ Rule.
Summary.
3. Functions.
Functions of One Argument.
Composition and Decomposition of Functions.
Functions of More than One Argument.
The Limits to Functional Decomposition.
Functions Can Map to Multi-Part Outputs.
Mapping to Multiple-Element Outputs Does Not Increase Expressive Power.
Defining Particular Functions.
Summary: Physical/Neurobiological Implications of Facts about Functions.
4. Representations.
Some Simple Examples.
Notation.
The Algebraic Representation of Geometry.
5. Symbols.
Physical Properties of Good Symbols.
Symbol Taxonomy.
Summary.
6. Procedures.
Algorithms.
Procedures, Computation, and Symbols.
Coding and Procedures.
Two Senses of Knowing.
A Geometric Example.
7. Computation.
Formalizing Procedures.
The Turing Machine.
Turing Machine for the Successor Function.
Turing Machines for ƒ_is_even.
Turing Machines for ƒ_+.
Minimal Memory Structure.
General Purpose Computer.
Summary.
8. Architectures.
One-Dimensional Look-Up Tables (If-Then Implementation).
Adding State Memory: Finite-State Machines.
Adding Register Memory.
Summary.
9. Data Structures.
Finding Information in Memory.
An Illustrative Example.
Procedures and the Coding of Data Structures.
The Structure of the Read-Only Biological Memory.
10. Computing with Neurons.
Transducers and Conductors.
Synapses and the Logic Gates.
The Slowness of It All.
The Time-Scale Problem.
Synaptic Plasticity.
Recurrent Loops in Which Activity Reverberates.
11. The Nature of Learning.
Learning As Rewiring.
Synaptic Plasticity and the Associative Theory of Learning.
Why Associations Are Not Symbols.
Distributed Coding.
Learning As the Extraction and Preservation of Useful Information.
Updating an Estimate of One’s Location.
12. Learning Time and Space.
Computational Accessibility.
Learning the Time of Day.
Learning Durations.
Episodic Memory.
13. The Modularity of Learning.
Example 1: Path Integration.
Example 2: Learning the Solar Ephemeris.
Example 3: “Associative” Learning.
Summary.
14. Dead Reckoning in a Neural Network.
Reverberating Circuits as Read/Write Memory Mechanisms.
Implementing Combinatorial Operations by Table-Look-Up.
The Full Model.
The Ontogeny of the Connections?
How Realistic is the Model?
Lessons to be Drawn.
Summary.
15. Neural Models of Interval Timing.
Timing an Interval on First Encounter.
Dworkin’s Paradox.
Neurally Inspired Models.
The Deeper Problems.
16. The Molecular Basis of Memory.
The Need to Separate Theory of Memory from Theory of Learning.
The Coding Question.
A Cautionary Tale.
Why Not Synaptic Conductance?
A Molecular or Sub-Molecular Mechanism?
Bringing the Data to the Computational Machinery.
Is It Universal?
References.
Glossary.
Index.