ISBN: 9780585375878

Summary

Much research focuses on the question of how information is processed in nervous systems, from the level of individual ionic channels to large-scale neuronal networks, and from "simple" animals such as sea slugs and flies to cats and primates. New interdisciplinary methodologies combine a bottom-up experimental methodology with the more top-down-driven computational and modeling approach. This book serves as a handbook of computational methods and techniques for modeling the functional properties of single nerve cells and groups of nerve cells.

The contributors highlight several key trends: (1) the tightening link between analytical/numerical models and the associated experimental data; (2) the broadening of modeling methods, both at the subcellular level and at the level of large neuronal networks that incorporate the real biophysical properties of neurons as well as the statistical properties of spike trains; and (3) the organization of the data gained by physical emulation of nervous system components through the use of very large-scale integration (VLSI) circuit technology.

The field of neuroscience has grown dramatically since the first edition of this book was published nine years ago. Half of the chapters of the second edition are completely new; the remaining ones have all been thoroughly revised. Many chapters provide an opportunity for interactive tutorials and simulation programs, which can be accessed via Christof Koch's website.

Contributors: Larry F. Abbott, Paul R. Adams, Hagai Agmon-Snir, James M. Bower, Robert E. Burke, Erik de Schutter, Alain Destexhe, Rodney Douglas, Bard Ermentrout, Fabrizio Gabbiani, David Hansel, Michael Hines, Christof Koch, Misha Mahowald, Zachary F. Mainen, Eve Marder, Michael V. Mascagni, Alexander D. Protopapas, Wilfrid Rall, John Rinzel, Idan Segev, Terrence J. Sejnowski, Shihab Shamma, Arthur S. Sherman, Paul Smolen, Haim Sompolinsky, Michael Vanier, Walter M. Yamada.



Methods in Neuronal Modeling
Contents
Series Foreword
Preface
1— Kinetic Models of Synaptic Transmission
1.1— Introduction: The Kinetic Interpretation of Ion Channel Gating
1.2— Presynaptic Mechanisms of Transmitter Release
1.2.1— Model of Transmitter Release
1.2.2— Further Simplification of the Release Process
1.3— Markov Models of Postsynaptic Currents
1.3.1— AMPA/Kainate Receptors
1.3.2— NMDA Receptors
1.3.3— GABA_A Receptors
1.3.4— GABA_B Receptors
1.3.5— Other Neuromodulators
1.4— Simplified Models of Postsynaptic Currents
1.4.1— AMPA/Kainate Receptors
1.4.2— NMDA Receptors
1.4.3— GABA_A Receptors
1.4.4— GABA_B Receptors and Neuromodulators
1.5— Implementation
1.5.1— Synaptic Summation
1.5.2— Connecting Networks
Acknowledgments
Appendix A— Kinetic Models of Gating Mechanisms
Appendix B— Fitting Kinetic Models to Experimental Data
Appendix C— Optimized Algorithms
Single Synapse
Multiple Synapses
Appendix D— Tutorials for Implementing Network Simulations
2— Cable Theory for Dendritic Neurons
2.1— Background
2.2— The Cable Equation
2.2.1— Definitions
2.2.2— Assumptions and Derivation for Cable Equation
2.2.3— Boundary Conditions for the Cable Equation
2.3— Steady-State Solutions for a Passive Uniform Cable
2.3.1— The Importance of the Steady-State Case
2.3.2— General Solution of Ordinary Differential Equation
2.3.3— Steady-State Solutions for Different Boundary Conditions
2.3.4— Input Conductance and Input Resistance
2.4— Steady-State Solutions for Passive Dendritic Trees
2.4.1— Definitions and Concepts
The Concept of Directional Properties at a Point
2.4.2— Properties of the Steady-State Voltage in Passive Dendritic Trees
Linearity Property
Reciprocity Property
Attenuation Rate Property
Attenuation Factor Properties
Input Conductance Properties
Attenuation Cost and Voltage Clamp Properties
2.4.3— Calculating the Input Conductance and Attenuation in a Dendritic Tree with Arbitrary Branching
Input Conductance at the Origin of a Dendritic Tree
Attenuation from the Origin of a Dendritic Tree
Algorithm that Computes Input Conductance at Every Branch Point and Attenuation Factor along All Branches
Computation Time and Extensions of the Algorithms
2.4.4— The Effective Length Constant and the Morphoelectrotonic Transform
2.5— Family of Trees Related to an Equivalent Cylinder
2.5.1— Definition and Properties
2.5.2— Dendritic Surface Area and Input Conductance of Tree
2.5.3— Input Conductance of a Neuron
2.5.4— Steady-State Decrement of Voltage with Distance
2.6— Transient Solutions and Properties
2.6.1— Two Classes of Explicit Solutions
2.6.2— Voltage Decay Transient for Cylinder of Finite Length with Sealed Ends
2.6.3— Lumped Soma Coupled to Cylinders
2.6.4— Using the Laplace Transform for Analyzing the Transient Solution
2.6.5— The Method of Moments
Strength
Characteristic Time
2.7— Synaptic Input as a Conductance Change
2.7.1— Formal Representation of Synaptic Excitation and Inhibition
2.7.2— Synaptic Excitation Distributed over Half of an Equivalent Cylinder
2.8— Insights Gained from Compartmental Computations
2.8.1— Effect of Synaptic Input Location on the Excitatory Postsynaptic Potential Shape at the Soma
2.8.2— Effect of Spatiotemporal Pattern of Synaptic Input
2.8.3— Effect of Synaptic Inhibitory Input Location
2.9— Insights Gained from Other Cable Computations
2.9.1— Transients at Different Locations in a Dendritic Tree for Input to One Branch
2.9.2— Computation of Field Potentials in Olfactory Bulb
2.9.3— Voltage Clamp at the Soma of a Dendritic Neuron
3— Compartmental Models of Complex Neurons
3.1— Introduction
3.2— Principles of Compartmental Neuron Models
3.2.1— Overview
3.2.2— Mathematical Formulation
3.2.3— Membrane Models
3.2.4— Methods and Approaches
3.3— Embodying Neuronal Morphology in Compartmental Models
3.3.1— Encoding Neuron Structure
3.3.2— Neuron Simulation
3.3.3— Estimation of Passive Membrane Parameters
3.3.4— Current Flow in Cylindrical Compartments: Normalized Conductance Ratios
3.3.5— Steady-State Input Resistance of Dendrites and Neurons
3.3.6— Steady-State Attenuation
3.3.7— Matching Experimental and Simulated Transient Responses
3.4— Reduced Models of Neurons
3.4.1— Reducing Geometrical Complexity: "Equivalent Cables"
3.4.2— Pinsky and Rinzel Reduced Model for Hippocampal CA3 Neurons
3.4.3— Dendritic Spines and Massive Synaptic Inputs
3.5— Discussion
3.5.1— The Issue of Model Complexity
3.5.2— Passive versus Active Membrane
3.5.3— Conclusions
Acknowledgments
Appendix: The Neurosimulator NEURON
Single-Compartment Simulations
Simulations of Three-Dimensional Reconstructed Cells
Notes
4— Multiple Channels and Calcium Dynamics
4.1— Introduction
4.2— Modeling Ionic Current Flow
4.3— Inward Currents
4.4— Outward Currents
4.5— Synaptic Input
4.6— Calcium Diffusion and Buffering
4.6.1— Calcium Current
4.6.2— Calcium Diffusion
4.6.3— Calcium Buffers
4.6.4— Calcium Pumps
4.7— Potassium Accumulation
4.8— Integration
4.8.1— Voltage Update
4.8.2— Calcium Update
4.8.3— Variable Time Step
4.9— Results
4.10— Discussion
Appendix: Modeling Bullfrog Sympathetic Ganglion Cells
Principal Equation
Fast Sodium Current
Fast Calcium Current
Transient, Outward Potassium Current
Noninactivating Muscarinic Potassium Current
Delayed, Rectifying Potassium Current
Noninactivating Calcium-Dependent Potassium Current
Voltage-Independent, Calcium-Dependent Potassium Current
Passive Components
Fast, Nicotinic Synaptic Input
5— Modeling Active Dendritic Processes in Pyramidal Neurons
5.1— Introduction
5.2— Passive Cable Models
5.2.1— Passive Electrical Structure
Geometry
Passive Properties
5.2.2— Active Channels
5.2.3— Temperature Dependence
5.2.4— Density Estimation
Anatomical Density Estimation
Whole-Cell Recording
Single-Channel Recording
Imaging
Na+ Imaging
Membrane Potential Imaging
5.2.5— Channel Types
General Observations
Sodium Channels
Kinetics
Distribution
Calcium Channels
Subtypes
Distribution
Potassium Channels
Fast Spike Repolarization
Spike Frequency Adaptation
Inward Rectification
5.2.6— Axonal Structure and Function
Anatomy
Sodium Channels
Other Properties
Propagation
5.2.7— Exploring Parameters
5.3— Applications
5.3.1— Synaptic Integration
5.3.2— Spike Initiation
5.3.3— Intrinsic Firing Patterns
5.4— Analysis
5.4.1— Reduced Models
5.4.2— Current-Voltage Curves
5.4.3— Phase Plane Analysis
Internet Resources
Acknowledgments
6— Calcium Dynamics in Large Neuronal Models
6.1— Introduction
6.2— Phenomenological Models of Calcium Dynamics
6.2.1— The Simple Pool Model of Calcium Concentration
6.2.2— A Model of Synaptic Transmitter Release
6.3— Rectifying Calcium Channels and Pumps
6.3.1— Goldman-Hodgkin-Katz Equation
6.3.2— Calcium Pumps
Na+-Ca2+ Exchanger
Purkinje Cell Model
Tabulated Equations
6.4— Diffusion of Calcium
6.4.1— One-Dimensional Diffusion in Cylinders and Spheres
Compartmental Models of Neurons
Modeling Spines
Examples of Spine Models
6.4.2— Multidimensional Diffusion
Alternating Direction Implicit Method
Domain Model
6.5— Electrodiffusion Models
6.5.1— Description
6.5.2— Applicability and Examples
6.6— Buffer Capacity and Buffer Diffusion
6.6.1— Buffer Capacity
6.6.2— Buffer Diffusion
6.6.3— Purkinje Cell Model
Results
6.6.4— Modeling Calcium Indicator Dyes
Purkinje Cell Model
6.6.5— Predictions from Linearized Calcium Theory
6.7— Uptake and Release from Calcium Stores
6.7.1— Calcium Uptake and CICR
6.7.2— IP3-Induced Calcium Release
IP3 Concentration
6.7.3— The Complete Model of Release from Stores
Purkinje Cell Model
6.8— Conclusions
Acknowledgments
Note
Appendix A: Purkinje Cell Model Description
Appendix B: Parameters for Calcium Stores
7— Analysis of Neural Excitability and Oscillations
7.1— Introduction
7.2— Models for Excitable Cells and Networks
7.3— Understanding Dynamics via Phase Plane Analysis
7.3.1— The Geometry of Excitability
7.3.2— Oscillations Emerging with Nonzero Frequency
7.3.3— Oscillations Emerging with Zero Frequency
7.3.4— More Bistability
7.4— Bursting and Adaptation: Spiking Dynamics with Slow Modulation
7.4.1— Square-Wave Bursters
7.4.2— Chaos and Poincaré Maps
7.4.3— Elliptic Bursters
7.4.4— Parabolic Bursting: Two Slow Processes
7.5— Phase-Resetting and Phase-Locking of Oscillators
7.5.1— Phase Response Curves
7.5.2— Averaging and Weak Coupling
7.6— Summary
Acknowledgment
Appendix A— Morris-Lecar Equations
Appendix B— Numerical Methods
8— Design and Fabrication of Analog VLSI Neurons
8.1— Introduction
8.2— Mapping Neurons into aVLSI
8.3— Active Channels
8.4— The Action Potential
8.5— Design and Fabrication
8.6— Conclusion
Acknowledgments
Notes
9— Principles of Spike Train Analysis
9.1— Introduction
9.2— Models
9.2.1— Perfect Integrate-and-Fire Neuron
9.2.2— Refractory Period
9.2.3— Leaky Integrate-and-Fire Neuron
9.2.4— Poisson Spike Trains and Integrate-and-Fire Neurons with Random Threshold
9.3— Interspike Interval Distribution and Coefficient of Variation
9.4— Spike Count Distribution and Fano Factor
9.4.1— Relationship between Coefficient of Variation and Fano Factor
9.4.2— Relationship between Fano Factor and the Autocorrelation Function
9.5— Signal Detection and Receiver Operating Characteristic Analysis
9.6— Autocorrelation and Power Spectrum
9.6.1— The Autocorrelation Function
9.6.2— The Power Spectrum
9.6.3— Spike Train Analysis of Linear Encoding Systems
9.7— Wiener Kernels and Stimulus Estimation
9.7.1— First-Order Wiener Kernel and Reverse Correlation
9.7.2— Nonlinear Encoding and Higher Kernels
9.7.3— Stimulus Estimation and Reliability of Encoding
9.7.4— More General Estimation Techniques
9.7.5— Nonlinear Encoding and Stimulus Estimation
Acknowledgments
Appendix A— Numerical Estimation Methods
Mean and Variance of the Interspike Interval Distribution
Mean and Variance of the Spike Count
Power Spectrum and Autocorrelation of the Spike Train
First-Order Wiener Kernel, Wiener-Kolmogorov-Filtering
Appendix B— MATLAB Interface and Routines
Location
System Requirements
Software and Data
Notes
10— Modeling Small Networks
10.1— Introduction
10.2— The Pyloric Network of the Stomatogastric Ganglion
10.3— Conductance-Based Models
10.3.1— A Conductance-Based Model of the LP Neuron
10.3.2— Reduction of Conductance-Based Models
10.3.3— Multicompartment Models
10.3.4— Beyond the Hodgkin-Huxley Model: Channels and Conductances
10.3.5— State-Dependent Inactivation of the Kv1.3 Conductance
10.4— Problems with Conductance-Based Models
10.4.1— The Dynamic Clamp Technique
10.4.2— Single-Neuron "Short-Term Memory" Effects from Kv1.3
10.4.3— Activity-Dependent Conductances
10.5— Synaptic Subcircuits
10.5.1— Modeling Electrical Synapses
10.5.2— Modeling Chemical Synaptic Transmission
10.5.3— Modeling Facilitation and Depression
10.5.4— Study of a Reciprocally Inhibitory Oscillator
10.5.5— Other Examples of Simulated Synaptic Conductances
10.6— Neuromodulation of Central Pattern Generators
10.6.1— The Effects of Dopamine on the Pyloric Rhythm
10.6.2— Dynamic Clamp Modeling of the Effects of Proctolin
10.7— Current Problems and Issues
Acknowledgments
11— Spatial and Temporal Processing in Central Auditory Networks
11.1— Introduction
11.2— The Mammalian Auditory System
11.2.1— The Spectral Estimation Problem
11.2.2— The Spectral Analysis Problem
11.3— The Single-Neuron Model
11.4— Neural Networks for Spectral Estimation
11.4.1— The Mean Rate Hypothesis
11.4.2— The Periodicity Hypothesis: Neural Networks for Temporal Processing
An Example of a Network Implementation
11.5— Neural Networks for Spatial Processing: Lateral Inhibitory Networks
11.5.1— Analysis of the Nonrecurrent Lateral Inhibitory Network
11.5.2— Analysis of the Recurrent Lateral Inhibitory Network
11.5.3— Spatial Processing with the Lateral Inhibitory Network: Edge Detection and Peak Selection
11.5.4— Temporal Processing with Lateral Inhibitory Network: Onset Sharpening and Oscillations
11.5.5— Processing with More Elaborate Lateral Inhibitory Network Models
11.5.6— Other Formulations for Early Auditory Processing
11.6— Implementations of Lateral Inhibitory Networks
11.6.1— Simulating Nonrecurrent Lateral Inhibitory Networks
11.6.2— Simulating Nonlinear Recurrent Lateral Inhibitory Networks
11.6.3— Summary of the Lateral Inhibitory Network Processing of Auditory Patterns
11.7— Cortical Representation of the Spectral Profile: The Spectral Analysis Problem
11.7.1— Mathematical Formulation of the Cortical Model
The Cortical Model
Linearity of the Cortical Model
The Dynamic Cortical Model
11.7.2— Measuring the Response Field with Stationary Ripples
Validating the Linear Cortical Model
11.7.3— Measuring Dynamic Response Fields Using Moving Ripples
Theoretical Framework
Validating the Dynamic Cortical Model
11.7.4— Response Nonlinearities
11.7.5— Summary
11.8— The Biological Plausibility of a Neural Network Model
Notes
12— Simulating Large Networks of Neurons
12.1— Introduction
12.2— General Issues
12.2.1— The GENESIS Neural Simulator
12.2.2— Realistic Modeling and Questions of Scale
12.2.3— The Piriform Cortex Network Model
12.3— Modeling Objectives
12.4— Overall Structure of Piriform Cortex and the Model
12.5— Simplifying Network Components
12.5.1— Connections between Individual Neurons
12.5.2— Numbers of Neurons
12.5.3— Types of Neurons
12.5.4— Biophysical Properties
12.6— Modeling Results
12.6.1— Tuning Network Parameters
12.6.2— Simulating the Electroencephalogram
12.6.3— Functional Significance
12.7— Detailed Model of a Single Pyramidal Cell
12.7.1— Structure of the Pyramidal Cell Model
12.7.2— Passive Properties
12.7.3— Active Conductances
12.7.4— Synaptic Conductances
12.7.5— Simplifying Cellular Components
12.7.6— Tuning Neuronal Parameters
12.7.7— Response to Synaptic Input
12.8— Refining the Network Model
12.8.1— Reducing the Single-Cell Model
12.8.2— The Costs of Model Simplification
12.9— Discussion
Acknowledgments
Appendix A— Using the Model to Generate Field Potential Events
Appendix B— Neuronal Objects in GENESIS
Compartmental Representation
Voltage- and Calcium-Gated Currents
Synaptic Currents
Appendix C— Numerical Methods
Explicit Methods
Forward Euler
Adams-Bashforth
Exponential Euler
Implicit Methods
Backward Euler
Crank-Nicolson
Hines Method for Solving Branched Dendritic Trees
Appendix D— Network Connections
Appendix E— Additional Features of GENESIS
Simulating Synaptic Plasticity
Parameter Search Routines
Parallel GENESIS
Chemical Kinetics
Appendix F— Model Parameters
Parameters for Full and Reduced Single-Cell Models
Fast Sodium Current
Persistent Sodium Current
Potassium Delayed Rectifier
Potassium A-Current
Potassium M-Current
Potassium Slow AHP Current
Fast Calcium Current
Slow Calcium Current
Synaptic Currents
Parameters for the Piriform Cortex Network Model
13— Modeling Feature Selectivity in Local Cortical Circuits
13.1— Introduction
13.2— Model of a Cortical Hypercolumn
13.2.1— Network Architecture
13.2.2— Network Dynamics
13.3— One-Population Rate Model
13.4— Stationary Activity Profiles
13.4.1— Broad Activity Profile
13.4.2— Narrow Activity Profile
13.4.3— Weakly Modulated Cortical Interactions
Afferent Mechanism of Feature Selectivity
Uniform Cortical Inhibition
General Case
13.4.4— Strongly Modulated Cortical Interactions
Homogeneous Input: Marginal Phase
Tuned Input
13.5— Moving Activity Profiles
13.5.1— Response to Changing Stimulus Feature
Response to Sudden Change in Stimulus Feature: Virtual Rotation
Locking to a Moving Stimulus Feature
Slow Stimulus (V < V_C)
Fast Stimulus (V > V_C)
13.5.2— Intrinsic Moving Profiles
Modeling Neuronal Adaptation
13.6— Model with Short-Range Excitation
13.6.1— Broad Activity Profile
13.6.2— Narrow Profiles and Marginal Phase
13.6.3— Tuned Input
Narrow Stimulus
Strongly Tuned Input
Weakly Tuned Input
Broad Stimulus
13.6.4— Intrinsic Moving Profiles
13.7— Network Model with Conductance-Based Dynamics
13.7.1— Conductance-Based Dynamics of Point Neurons
13.7.2— Mean Field Theory of Asynchronous States
13.7.3— Details of the Numerical Simulations
Single-Neuron Dynamics
Network Architecture
13.7.4— The Marginal Phase
Stationary Stimulus
Virtual Rotation
13.7.5— Network with Adaptation: Intrinsic Moving Profiles
13.8— Discussion
Acknowledgments
Appendix A— Solution of the One-Population Rate Model
General Time-Dependent Equations
Stationary State
Response to Moving Stimulus
Appendix B— Stability of the Stationary States
Stability of the Broad Profile
Stability of the Marginal Phase
Instability of the Marginal Phase Due to Adaptation
Appendix C— Details of Conductance-Based Model
Sodium Current: I_Na = g_Na m^3 h (V – V_Na)
Delayed-Rectifier Potassium Current: I_K = g_K n^4 (V – V_K)
A-Current: I_A = g_A a b (V – V_A)
Persistent Sodium Current: I_NaP = g_NaP s_inf(V) (V – V_Na)
Slow Potassium Current: I_z = g_z z (V – V_K)
14— Numerical Methods for Neuronal Modeling
14.1— Introduction
14.1.1— Numerical Preliminaries
14.2— Methods for Ordinary Differential Equations
14.2.1— Runge-Kutta Methods
14.2.2— Multistep Methods
14.2.3— Methods with Adaptive Step Size
14.2.4— Qualitative Analysis of Stiffness
14.2.5— Methods for Stiff Systems
14.2.6— Boundary Value Problems
14.2.7— Problems with Discontinuities
14.2.8— Guide to Method Selection and Packages
14.3— Methods for Partial Differential Equations
14.3.1— Finite-Difference Methods
14.3.2— Boundary Conditions
14.3.3— Spatial Variation
14.3.4— Solving Tridiagonal Linear Systems
14.3.5— Branching
14.3.6— Nonlinear Equations
14.3.7— Networks
14.3.8— Concluding Remarks and Suggestions for PDEs
14.4— Final Comments
Acknowledgments
Appendix: Stiffness
References
Contributors
Index
