Friday, April 15, 2011

Algorithms and Architectures

Contents
Contributors xv
Preface xix
Statistical Theories of Learning in Radial Basis
Function Networks
Jason A. S. Freeman, Mark J. L. Orr, and David Saad
I. Introduction 1
A. Radial Basis Function Network 2
II. Learning in Radial Basis Function Networks 4
A. Supervised Learning 4
B. Linear Models 5
C. Bias and Variance 9
D. Cross-Validation 11
E. Ridge Regression 13
F. Forward Selection 17
G. Conclusion 19
III. Theoretical Evaluations of Network Performance 21
A. Bayesian and Statistical Mechanics Approaches 21
B. Probably Approximately Correct Framework 31
C. Approximation Error/Estimation Error 37
D. Conclusion 39
IV. Fully Adaptive Training—An Exact Analysis 40
A. On-Line Learning in Radial Basis
Function Networks 41
B. Generalization Error and System Dynamics 42
C. Numerical Solutions 43
D. Phenomenological Observations 45
E. Symmetric Phase 47
F. Convergence Phase 49
G. Quantifying the Variances 50
H. Simulations 52
I. Conclusion 52
V. Summary 54
Appendix 55
References 57
Synthesis of Three-Layer Threshold Networks
Jung Hwan Kim, Sung-Kwon Park, Hyunseo Oh, and Youngnam Han
I. Introduction 62
II. Preliminaries 63
III. Finding the Hidden Layer 64
IV. Learning an Output Layer 73
V. Examples 77
A. Approximation of a Circular Region 77
B. Parity Function 80
C. 7-Bit Function 83
VI. Discussion 84
VII. Conclusion 85
References 86
Weight Initialization Techniques
Mikko Lehtokangas, Petri Salmela, Jukka Saarinen, and Kimmo Kaski
I. Introduction 87
II. Feedforward Neural Network Models 89
A. Multilayer Perceptron Networks 89
B. Radial Basis Function Networks 90
III. Stepwise Regression for Weight Initialization 90
IV. Initialization of Multilayer Perceptron Networks 92
A. Orthogonal Least Squares Method 92
B. Maximum Covariance Method 93
C. Benchmark Experiments 93
V. Initial Training for Radial Basis Function Networks 98
A. Stepwise Hidden Node Selection 98
B. Benchmark Experiments 99
VI. Weight Initialization in Speech
Recognition Application 103
A. Speech Signals and Recognition 103
B. Principle of the Classifier 104
C. Training the Hybrid Classifier 106
D. Results 109
VII. Conclusion 116
Appendix I: Chessboard 4 X 4 116
Appendix II: Two Spirals 117
Appendix III: GaAs MESFET 117
Appendix IV: Credit Card 117
References 118
Fast Computation in Hamming and Hopfield Networks
Isaac Meilijson, Eytan Ruppin, and Moshe Sipper
I. General Introduction 123
II. Threshold Hamming Networks 124
A. Introduction 124
B. Threshold Hamming Network 126
C. Hamming Network and an Optimal Threshold
Hamming Network 128
D. Numerical Results 132
E. Final Remarks 134
III. Two-Iteration Optimal Signaling in
Hopfield Networks 135
A. Introduction 135
B. Model 137
C. Rationale for Nonmonotone Bayesian Signaling
D. Performance 142
E. Optimal Signaling and Performance 146
F. Results 148
G. Discussion 151
IV. Concluding Remarks 152
References 153
Multilevel Neurons
J. Si and A. N. Michel
I. Introduction 155
II. Neural System Analysis 157
A. Neuron Models 158
B. Neural Networks 160
C. Stability of an Equilibrium 162
D. Global Stability Results 164
III. Neural System Synthesis for Associative Memories 167
A. System Constraints 168
B. Synthesis Procedure 170
IV. Simulations 171
V. Conclusions and Discussions 173
Appendix 173
References 178
Probabilistic Design
Sumio Watanabe and Kenji Fukumizu
I. Introduction 181
II. Unified Framework of Neural Networks 182
A. Definition 182
B. Learning in Artificial Neural Networks 185
III. Probabilistic Design of Layered Neural Networks 189
A. Neural Network That Finds Unknown Inputs 189
B. Neural Network That Can Tell the Reliability of Its
Own Inference 192
C. Neural Network That Can Illustrate Input Patterns for a
Given Category 196
IV. Probability Competition Neural Networks 197
A. Probability Competition Neural Network Model and Its
Properties 198
B. Learning Algorithms for a Probability Competition
Neural Network 203
C. Applications of the Probability Competition
Neural Network Model 210
V. Statistical Techniques for Neural Network Design 218
A. Information Criterion for the Steepest Descent 218
B. Active Learning 225
VI. Conclusion 228
References 228
Short Time Memory Problems
M. Daniel Tom and Manoel Fernando Tenorio
I. Introduction 231
II. Background 232
III. Measuring Neural Responses 233
IV. Hysteresis Model 234
V. Perfect Memory 237
VI. Temporal Precedence Differentiation 239
VII. Study in Spatiotemporal Pattern Recognition 241
VIII. Conclusion 245
Appendix 246
References 260
Reliability Issue and Quantization Effects in Optical
and Electronic Network Implementations of
Hebbian-Type Associative Memories
Pau-Choo Chung and Ching-Tsorng Tsai
I. Introduction 261
II. Hebbian-Type Associative Memories 264
A. Linear-Order Associative Memories 264
B. Quadratic-Order Associative Memories 266
III. Network Analysis Using a Signal-to-Noise
Ratio Concept 266
IV. Reliability Effects in Network Implementations 268
A. Open-Circuit Effects 269
B. Short-Circuit Effects 274
V. Comparison of Linear and Quadratic Networks 278
VI. Quantization of Synaptic Interconnections 281
A. Three-Level Quantization 282
B. Three-Level Quantization with
Conserved Interconnections 286
VII. Conclusions 288
References 289
Finite Constraint Satisfaction
Angela Monfroglio
I. Constrained Heuristic Search and Neural Networks for Finite
Constraint Satisfaction Problems 293
A. Introduction 293
B. Shared Resource Allocation Algorithm 295
C. Satisfaction of a Conjunctive Normal Form 300
D. Connectionist Networks for Solving n-Conjunctive Normal
Form Satisfiability Problems 305
E. Other Connectionist Paradigms 311
F. Network Performance Summary 317
II. Linear Programming and Neural Networks 323
A. Conjunctive Normal Form Satisfaction and
Linear Programming 324
B. Connectionist Networks That Learn to Choose the
Position of Pivot Operations 329
III. Neural Networks and Genetic Algorithms 331
A. Neural Network 332
B. Genetic Algorithm for Optimizing the
Neural Network 336
C. Comparison with Conventional Linear Programming
Algorithms and Standard Constraint Propagation and
Search Techniques 337
D. Testing Data Base 340
IV. Related Work, Limitations, Further Work,
and Conclusions 341
Appendix I. Formal Description of the Shared Resource
Allocation Algorithm 342
Appendix II. Formal Description of the Conjunctive Normal
Form Satisfiability Algorithm 346
A. Discussion 348
Appendix III. A 3-CNF-SAT Example 348
Appendix IV. Outline of Proof for the Linear
Programming Algorithm 350
A. Preliminary Considerations 350
B. Interior Point Methods 357
C. Correctness and Completeness 358
References 359
Parallel, Self-Organizing, Hierarchical Neural
Network Systems
O. K. Ersoy
I. Introduction 364
II. Nonlinear Transformations of Input Vectors 366
A. Binary Input Data 366
B. Analog Input Data 366
C. Other Transformations 367
III. Training, Testing, and Error-Detection Bounds 367
A. Training 367
B. Testing 368
C. Detection of Potential Errors 368
IV. Interpretation of the Error-Detection Bounds 371
V. Comparison between the Parallel, Self-Organizing,
Hierarchical Neural Network, the Backpropagation Network,
and the Maximum Likelihood Method 373
A. Normally Distributed Data 374
B. Uniformly Distributed Data 379
VI. PNS Modules 379
VII. Parallel Consensual Neural Networks
A. Consensus Theory 382
B. Implementation 383
C. Optimal Weights 384
D. Experimental Results 385
VIII. Parallel, Self-Organizing, Hierarchical Neural Networks with
Competitive Learning and Safe Rejection Schemes 385
A. Safe Rejection Schemes 387
B. Training 389
C. Testing 390
D. Experimental Results 392
IX. Parallel, Self-Organizing, Hierarchical Neural Networks with
Continuous Inputs and Outputs 392
A. Learning of Input Nonlinearities by
Revised Backpropagation 393
B. Forward-Backward Training 394
X. Recent Applications 395
A. Fuzzy Input Signal Representation 395
B. Multiresolution Image Compression 397
XI. Conclusions 399
References 399
Dynamics of Networks of Biological Neurons:
Simulation and Experimental Tools
M. Bove, M. Giugliano, M. Grattarola, S. Martinoia, and G. Massobrio
I. Introduction 402
II. Modeling Tools 403
A. Conductance-Based Single-Compartment Differential
Model Neurons 403
B. Integrate-and-Fire Model Neurons 409
C. Synaptic Modeling 412
III. Arrays of Planar Microtransducers for Electrical Activity
Recording of Cultured Neuronal Populations 418
A. Neuronal Cell Cultures Growing on Substrate
Planar Microtransducers 419
B. Example of a Multisite Electrical Signal Recording from
Neuronal Cultures by Using Planar Microtransducer
Arrays and Its Simulations 420
IV. Concluding Remarks 421
References 422
Estimating the Dimensions of Manifolds Using
Delaunay Diagrams
Yun-Chung Chu
I. Delaunay Diagrams of Manifolds 425
II. Estimating the Dimensions of Manifolds 435
III. Conclusions 455
References 456
Index 457


More Neural Network Books
More books from The Core of CS Books
Download
