Showing posts with label Neural Network. Show all posts

Monday, April 16, 2012

Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering






Nikola K. Kasabov
A Bradford Book
The MIT Press
Cambridge, Massachusetts
London, England

Contents
Foreword by Shun-ichi Amari xi
Preface xiii
1 The Faculty of Knowledge Engineering and Problem Solving 1
1.1 Introduction to AI paradigms 1
1.2 Heuristic problem solving; genetic algorithms 3
1.3 Why expert systems, fuzzy systems, neural networks, and hybrid systems for knowledge engineering and problem solving? 14
1.4 Generic and specific AI problems: Pattern recognition and classification 19
1.5 Speech and language processing 28
1.6 Prediction 42
1.7 Planning, monitoring, diagnosis, and control 49
1.8 Optimization, decision making, and games playing 57
1.9 A general approach to knowledge engineering 65
1.10 Problems and exercises 68
1.11 Conclusion 72
1.12 Suggested reading 73
2 Knowledge Engineering and Symbolic Artificial Intelligence 75
2.1 Data, information, and knowledge: Major issues in knowledge engineering 75
2.2 Data analysis, data representation, and data transformation 80
2.3 Information structures and knowledge representation 89
2.4 Methods for symbol manipulation and inference: Inference as matching; inference as a search 100
2.5 Propositional logic 110
2.6 Predicate logic: PROLOG 113
2.7 Production systems 118
2.8 Expert systems 128
2.9 Uncertainties in knowledge-based systems: Probabilistic methods 132
2.10 Nonprobabilistic methods for dealing with uncertainties 140
2.11 Machine-learning methods for knowledge engineering 146
2.12 Problems and exercises 155
2.13 Conclusion 164
2.14 Suggested reading 164
3 From Fuzzy Sets to Fuzzy Systems 167
3.1 Fuzzy sets and fuzzy operations 167
3.2 Fuzziness and probability; conceptualizing in fuzzy terms; the extension principle 175
3.3 Fuzzy relations and fuzzy implications; fuzzy propositions and fuzzy logic 184
3.4 Fuzzy rules, fuzzy inference methods, fuzzification and defuzzification 192
3.5 Fuzzy systems as universal approximators; interpolation of fuzzy rules 205
3.6 Fuzzy information retrieval and fuzzy databases 208
3.7 Fuzzy expert systems 215
3.8 Pattern recognition and classification, fuzzy clustering, image and speech processing 223
3.9 Fuzzy systems for prediction 229
3.10 Control, monitoring, diagnosis, and planning 230
3.11 Optimization and decision making 234
3.12 Problems and exercises 236
3.13 Conclusion 248
3.14 Suggested reading 249
4 Neural Networks: Theoretical and Computational Models 251
4.1 Real and artificial neurons 251
4.2 Supervised learning in neural networks: Perceptrons and multilayer perceptrons 267
4.3 Radial basis functions, time-delay neural networks, recurrent networks 282
4.4 Neural network models for unsupervised learning 288
4.5 Kohonen self-organizing topological maps 293
4.6 Neural networks as associative memories 300
4.7 On the variety of neural network models 307
4.8 Fuzzy neurons and fuzzy neural networks 314
4.9 Hierarchical and modular connectionist systems 320
4.10 Problems 323
4.11 Conclusion 328
4.12 Suggested reading 329

5 Neural Networks for Knowledge Engineering and Problem Solving 331
5.1 Neural networks as a problem-solving paradigm 331
5.2 Connectionist expert systems 340
5.3 Connectionist models for knowledge acquisition: One rule is worth a thousand data examples 347
5.4 Symbolic rules insertion in neural networks: Connectionist production systems 359
5.5 Connectionist systems for pattern recognition and classification; image processing 365
5.6 Connectionist systems for speech processing 375
5.7 Connectionist systems for prediction 388
5.8 Connectionist systems for monitoring, control, diagnosis, and planning 398
5.9 Connectionist systems for optimization and decision making 402
5.10 Connectionist systems for modeling strategic games 405
5.11 Problems 409
5.12 Conclusions 418
5.13 Suggested reading 418
6 Hybrid Symbolic, Fuzzy, and Connectionist Systems: Toward Comprehensive Artificial Intelligence 421
6.1 The hybrid systems paradigm 421
6.2 Hybrid connectionist production systems 429
6.3 Hybrid connectionist logic programming systems 433
6.4 Hybrid fuzzy connectionist production systems 435
6.5 ("Pure") connectionist production systems: The NPS architecture (optional) 442
6.6 Hybrid systems for speech and language processing 455
6.7 Hybrid systems for decision making 460
6.8 Problems 462
6.9 Conclusion 473
6.10 Suggested reading 473
7 Neural Networks, Fuzzy Systems, and Nonlinear Dynamical Systems: Chaos; Toward New Connectionist and Fuzzy Logic Models 475
7.1 Chaos 475
7.2 Fuzzy systems and chaos: New developments in fuzzy systems 481
7.3 Neural networks and chaos: New developments in neural networks 486
7.4 Problems 497
7.5 Conclusion 502
7.6 Suggested reading 503
Appendixes 505
References 523
Glossary 539
Index 547


Neural network - Wikipedia, the free encyclopedia
Neural Networks
Fuzzy logic - Wikipedia, the free encyclopedia
Fuzzy Logic Tutorial - An Introduction

Neural Networks Theory
Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems
Other Neural Network Books
Other Fuzzy Logic Books
Download
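The fuzzification, rule evaluation, and defuzzification pipeline listed in Section 3.4 can be sketched in a few lines. The membership functions, the two rules, and the fan-speed example below are illustrative assumptions, not taken from the book:

```python
def mu_cold(t):
    """Membership in "cold": full below 10 degrees, zero above 30."""
    return max(0.0, min(1.0, (30 - t) / 20))

def mu_hot(t):
    """Membership in "hot": zero below 20 degrees, full above 40."""
    return max(0.0, min(1.0, (t - 20) / 20))

def fan_speed(t):
    """Mamdani-style inference with centroid defuzzification.
    Rules (invented): IF cold THEN speed is low; IF hot THEN speed is high."""
    w_low, w_high = mu_cold(t), mu_hot(t)   # fuzzify the input
    num = den = 0.0
    for s in range(0, 101):                 # candidate speeds 0..100
        mu_low = max(0.0, (50 - s) / 50)    # "low speed" output set
        mu_high = max(0.0, (s - 50) / 50)   # "high speed" output set
        # Clip each output set by its rule strength, aggregate by max.
        mu = max(min(w_low, mu_low), min(w_high, mu_high))
        num += s * mu
        den += mu
    return num / den if den else 50.0       # centroid of the aggregate
```

A cold reading (e.g. `fan_speed(5)`) defuzzifies to a low speed, a hot one (`fan_speed(45)`) to a high speed; in between, both rules fire partially and the centroid interpolates.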

Wednesday, January 25, 2012

Data Mining With Neural Networks Solving Business Problems From Application Development To Decision Support






Overview

This book is targeted directly at executives, managers, and computer professionals, explaining data mining and neural networks from a business information systems and management perspective. It presents data mining with neural networks as a strategic business technology, with the focus on the practical, competitive advantages they offer. In addition, the book provides a general methodology for neural network data mining and application development, using a set of realistic business problems as examples. The examples are developed using a commercially available neural network data mining tool, the IBM Neural Network Utility.

Is your organization storing huge quantities of data that could be useful--but you don't know where to start? Do you have information about customers, distributors, markets, and competitors that you are not using to full advantage? And are you personally more interested in strategic applications and general overviews than mind-numbing equations and printouts of code? If so, Data Mining with Neural Networks is the book for you. Written for a business audience, it explains how your company can mine a vast amount of data and transform it into strategic action. Highly recommended for any company that wants to develop sound plans based on powerful quantitative and analytical methods.


Table of Contents

Part 1 The Data Mining Process Using Neural Networks

Chapter 1. Introduction to Data Mining
Chapter 2. Introduction to Neural Networks
Chapter 3. Data Preparation
Chapter 4. Neural Network Models and Architectures
Chapter 5. Training and Testing Neural Networks
Chapter 6. Analyzing Neural Networks for Decision Support
Chapter 7. Deploying Neural Network Applications
Chapter 8. Intelligent Agents and Automated Data Mining

Part 2 Data Mining Application Case Studies

Chapter 9. Market Segmentation
Chapter 10. Real Estate Pricing Model
Chapter 11. Customer Ranking Model
Chapter 12. Sales Forecasting

Appendix A. IBM Neural Network Utility
Appendix B. Fuzzy Logic
Appendix C. Genetic Algorithms

Other Data Mining Books
Other Neural Network Books
Download
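A point implicit in Chapter 3 (Data Preparation) and Chapter 5 (Training and Testing) is that rescaling statistics should come from the training split only, so the test split plays no part in fitting. A minimal sketch of that discipline (the data and split are invented; the IBM Neural Network Utility's own preprocessing is not shown):

```python
def train_test_split(rows, test_fraction=0.25):
    """Hold out the last fraction of the rows for testing."""
    cut = int(len(rows) * (1 - test_fraction))
    return rows[:cut], rows[cut:]

def fit_minmax(rows):
    """Per-column (min, max) ranges, computed on the training rows only."""
    return [(min(c), max(c)) for c in zip(*rows)]

def apply_minmax(rows, stats):
    """Rescale each value to [0, 1] relative to the fitted ranges."""
    return [[(v - lo) / (hi - lo) if hi > lo else 0.0
             for v, (lo, hi) in zip(row, stats)]
            for row in rows]

data = [[10.0, 200.0], [20.0, 400.0], [30.0, 600.0], [40.0, 800.0]]
train, test = train_test_split(data)
stats = fit_minmax(train)                # ranges from training rows only
train_scaled = apply_minmax(train, stats)
test_scaled = apply_minmax(test, stats)  # may exceed [0, 1]; expected
```

Test values outside the training range land outside [0, 1], which is the honest behaviour: the network is then visibly extrapolating.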

Friday, January 13, 2012

Neural Network in Finance and Investing






Robert R. Trippi, Efraim Turban, editors.
Published 1996 by Irwin Professional Pub. in Chicago.
Written in English.
Edition Notes

Includes bibliographical references and indexes.
"A collection of survey and research articles"--Pref.
Includes a complimentary version of ThinksPro neural network software.
System requirements for accompanying computer disk: IBM-compatible PC; Windows 95 or Windows 3.1.

Neural Networks in Finance and Investing, Revised 2/E is an updated and expanded edition of the first-ever book on financial applications of neural networks. Robert Trippi and Efraim Turban have assembled here a stellar collection of articles by experts in industry and academia on applications of neural networks in this important arena. This widely-acclaimed classic provides portfolio managers, institutional investors, bankers, and analysts with a comprehensive and fascinating introduction to this important technology and numerous insights into its most effective use. Neural network successes and failures are discussed, as well as the vast unrealized potential of neural networks in numerous specialized areas of financial decision making. Topics include:

* Neural Network Fundamentals and Overview
* Analysis of Financial Condition
* Business Failure Prediction
* Debt Risk Assessment
* Stock Market Applications
* Futures and Options Markets Applications
* Neural Network Approaches to Financial Forecasting

Included as a bonus with this new edition is a complimentary version of ThinksPro for Windows, a full-featured neural network software package that can be used for many of the applications described in the book.

Nowhere else will the financial technology professional find such an exciting and relevant in-depth examination of neural networks. Individual chapters discuss how to use neural networks to forecast the stock market, to trade commodities, to assess bond and mortgage risk, to predict bankruptcy, and to implement investment strategies. Taken together, this comprehensive collection provides a fascinating and authoritative introduction to a technology that is revolutionizing the way financial services firms operate.

This unique volume is truly essential reading for anyone wishing to stay abreast of this "cutting edge" technology.


Other Neural Network Books
Download
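Behind most of the forecasting applications listed above sits the same preparation step: a price series is turned into supervised (window, next value) pairs so a network can learn to predict the next value from the last k. A generic sketch, independent of ThinksPro (the series and window length are invented):

```python
def make_windows(series, k):
    """Slide a window of length k over the series.
    Returns (inputs, targets): each input is k consecutive values,
    the target is the value that immediately follows them."""
    inputs, targets = [], []
    for i in range(len(series) - k):
        inputs.append(series[i:i + k])
        targets.append(series[i + k])
    return inputs, targets

prices = [101.0, 102.5, 101.8, 103.2, 104.0, 103.5]
X, y = make_windows(prices, k=3)   # 3 training pairs from 6 observations
```

Each (window, target) pair then becomes one training example for whatever network architecture the chapters discuss.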

Saturday, December 31, 2011

Introduction to Neural Networks






Kevin Gurney
University of Sheffield

Contents
Preface vii
1 Neural networks—an overview 1
1.1 What are neural networks? 1
1.2 Why study neural networks? 3
1.3 Summary 4
1.4 Notes 4
2 Real and artificial neurons 5
2.1 Real neurons: a review 5
2.2 Artificial neurons: the TLU 8
2.3 Resilience to noise and hardware failure 10
2.4 Non-binary signal communication 11
2.5 Introducing time 12
2.6 Summary 14
2.7 Notes 15
3 TLUs, linear separability and vectors 16
3.1 Geometric interpretation of TLU action 16
3.2 Vectors 18
3.3 TLUs and linear separability revisited 22
3.4 Summary 23
3.5 Notes 24
4 Training TLUs: the perceptron rule 25
4.1 Training networks 25
4.2 Training the threshold as a weight 25
4.3 Adjusting the weight vector 26
4.4 The perceptron 28
4.5 Multiple nodes and layers 29
4.6 Some practical matters 31
4.7 Summary 33
4.8 Notes 33
5 The delta rule 34
5.1 Finding the minimum of a function: gradient descent 34
5.2 Gradient descent on an error 36
5.3 The delta rule 37
5.4 Watching the delta rule at work 39
5.5 Summary 40
6 Multilayer nets and backpropagation 41
6.1 Training rules for multilayer nets 41
6.2 The backpropagation algorithm 42
6.3 Local versus global minima 43
6.4 The stopping criterion 44
6.5 Speeding up learning: the momentum term 44
6.6 More complex nets 45
6.7 The action of well-trained nets 46
6.8 Taking stock 50
6.9 Generalization and overtraining 50
6.10 Fostering generalization 52
6.11 Applications 54
6.12 Final remarks 56
6.13 Summary 56
6.14 Notes 56
7 Associative memories: the Hopfield net 57
7.1 The nature of associative memory 57
7.2 Neural networks and associative memory 58
7.3 A physical analogy with memory 58
7.4 The Hopfield net 59
7.5 Finding the weights 64
7.6 Storage capacity 66
7.7 The analogue Hopfield model 66
7.8 Combinatorial optimization 67
7.9 Feedforward and recurrent associative nets 68
7.10 Summary 69
7.11 Notes 69
8 Self-organization 70
8.1 Competitive dynamics 70
8.2 Competitive learning 72
8.3 Kohonen’s self-organizing feature maps 75
8.4 Principal component analysis 85
8.5 Further remarks 87
8.6 Summary 88
8.7 Notes 88
9 Adaptive resonance theory: ART 89
9.1 ART’s objectives 89
9.2 A hierarchical description of networks 90
9.3 ART1 91
9.4 The ART family 98
9.5 Applications 98
9.6 Further remarks 99
9.7 Summary 100
9.8 Notes 100
10 Nodes, nets and algorithms: further alternatives 101
10.1 Synapses revisited 101
10.2 Sigma-pi units 102
10.3 Digital neural networks 103
10.4 Radial basis functions 110
10.5 Learning by exploring the environment 112
10.6 Summary 115
10.7 Notes 116
11 Taxonomies, contexts and hierarchies 117
11.1 Classifying neural net structures 117
11.2 Networks and the computational hierarchy 120
11.3 Networks and statistical analysis 122
11.4 Neural networks and intelligent systems: symbols versus neurons 122
11.5 A brief history of neural nets 126
11.6 Summary 127
11.7 Notes 127
A The cosine function 128
References 130
Index 135


Other Neural Network Books
Download
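Gurney's Sections 4.2-4.4 fit in a short sketch: the TLU's threshold is folded in as an extra weight on a constant -1 input (Section 4.2), and the perceptron rule moves the weights by the error times the input (Section 4.3). The AND task, learning rate, and epoch count below are illustrative choices, not the book's exercises:

```python
def tlu(weights, x):
    """Threshold logic unit: fire iff the weighted sum of the augmented
    input (x with a constant -1 appended) is non-negative."""
    aug = x + [-1.0]                        # the -1 carries the threshold
    s = sum(w * v for w, v in zip(weights, aug))
    return 1 if s >= 0 else 0

def train_perceptron(samples, rate=0.25, epochs=20):
    """Perceptron rule: w <- w + rate * (target - output) * input."""
    w = [0.0, 0.0, 0.0]                     # two inputs + threshold weight
    for _ in range(epochs):
        for x, target in samples:
            err = target - tlu(w, x)
            aug = x + [-1.0]
            w = [wi + rate * err * v for wi, v in zip(w, aug)]
    return w

AND = [([0.0, 0.0], 0), ([0.0, 1.0], 0), ([1.0, 0.0], 0), ([1.0, 1.0], 1)]
w = train_perceptron(AND)
```

Since AND is linearly separable, the perceptron convergence theorem guarantees the rule settles on a separating weight vector; a handful of epochs suffices here.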

Thursday, December 22, 2011

Computational Neural Networks for Geophysical Data Processing, Volume 30






Table of Contents
Preface
Contributing Authors
Part I Introduction to Computational Neural Networks
Chapter 1 A Brief History
1. Introduction
2. Historical Development
2.1. McCulloch and Pitts Neuron
2.2. Hebbian Learning
2.3. Neurocomputing
2.4. Perceptron
2.5. ADALINE
2.6. Caianiello Neurons
2.7. Limitations
2.8. Next Generation
Chapter 2 Biological Versus Computational Neural Networks
1. Computational Neural Networks
2. Biological Neural Networks
3. Evolution of the Computational Neural Network
Chapter 3 Multi-Layer Perceptrons and Back-Propagation Learning
1. Vocabulary
2. Back-Propagation
3. Parameters
3.1. Number of Hidden Layers
3.2. Number of Hidden PEs
3.3. Threshold Function
3.4. Weight Initialization
3.5. Learning Rate and Momentum
3.6. Bias
3.7. Error Accumulation
3.8. Error Calculation
3.9. Regularization and Weight Decay
4. Time-Varying Data
Chapter 4 Design of Training and Testing Sets
1. Introduction
2. Re-Scaling
3. Data Distribution
4. Size Reduction
5. Data Coding
6. Order of Data
Chapter 5 Alternative Architectures and Learning Rules
1. Improving on Back-Propagation
1.1. Delta Bar Delta
1.2. Directed Random Search
1.3. Resilient Back-Propagation
1.4. Conjugate Gradient
1.5. Quasi-Newton Method
1.6. Levenberg-Marquardt
2. Hybrid Networks
2.1. Radial Basis Function Network
2.2. Modular Neural Network
2.3. Probabilistic Neural Network
2.4. Generalized Regression Neural Network
3. Alternative Architectures
3.1. Self Organizing Map
3.2. Hopfield Networks
3.3. Adaptive Resonance Theory
Chapter 6 Software and Other Resources
1. Introduction
2. Commercial Software Packages
3. Open Source Software
4. News Groups
Part II Seismic Data Processing
Chapter 7 Seismic Interpretation and Processing Applications
1. Introduction
2. Waveform Recognition
3. Picking Arrival Times
4. Trace Editing
5. Velocity Analysis
6. Elimination of Multiples
7. Deconvolution
8. Inversion
Chapter 8 Rock Mass and Reservoir Characterization
1. Introduction
2. Horizon Tracking and Facies Maps
3. Time-Lapse Interpretation
4. Predicting Log Properties
5. Rock/Reservoir Characterization
Chapter 9 Identifying Seismic Crew Noise
1. Introduction
1.1. Current Attenuation Methods
1.2. Patterns of Crew Noise Interference
1.3. Pre-Processing
2. Training Set Design and Network Architecture
2.1. Selection of Interference Training Examples
2.2. Selection of Signal Training Patterns
3. Testing
4. Analysis of Training and Testing
4.1. Sensitivity to Class Distribution
4.2. Sensitivity to Network Architecture
4.3. Effect of Confidence Level During Overlapping Window Tabulation
4.4. Effect of NMO Correction
5. Validation
5.1. Effect on Deconvolution
5.2. Effect on CMP Stacking
6. Conclusions
Chapter 10 Self-Organizing Map (SOM) Network for Tracking
Horizons and Classifying Seismic Traces
1. Introduction
2. Self-Organizing Map Network
3. Horizon Tracking
3.1. Training Set
3.2. Results
4. Classification of the Seismic Traces
4.1. Window Length and Placement
4.2. Number of Classes
5. Conclusions
Chapter 11 Permeability Estimation with an RBF Network and
Levenberg-Marquardt Learning
1. Introduction
2. Relationship Between Seismic and Petrophysical Parameters
2.1. RBF Network Training
2.2. Predicting Hydraulic Properties From Seismic Information: Relation
Between Velocity and Permeability
3. Parameters That Affect Permeability: Porosity, Grain Size, Clay Content
4. Neural Network Modeling of Permeability Data
4.1. Data Analysis and Interpretation
4.2. Assessing the Relative Importance of Individual Input Attributes
5. Summary and Conclusions
Chapter 12 Caianiello Neural Network Method for Geophysical
Inverse Problems
1. Introduction
2. Generalized Geophysical Inversion
2.1. Generalized Geophysical Model
2.2. Ill-Posedness and Singularity
2.3. Statistical Strategy
2.4. Ambiguous Physical Relationship
3. Caianiello Neural Network Method
3.1. McCulloch-Pitts Neuron Model
3.2. Caianiello Neuron Model
3.3. The Caianiello Neuron-Based Multi-Layer Network
3.4. Neural Wavelet Estimation
3.5. Input Signal Reconstruction
3.6. Nonlinear Factor Optimization
4. Inversion With Simplified Physical Models
4.1. Simplified Physical Model
4.2. Joint Impedance Inversion Method
4.3. Nonlinear Transform
4.4. Joint Inversion Step 1: MSI and MS Wavelet Extraction At the Wells
4.5. Joint Inversion Step 2: Initial Impedance Model Estimation
4.6. Joint Inversion Step 3: Model-Based Impedance Improvement
4.7. Large-Scale Stratigraphic Constraint
5. Inversion With Empirically-Derived Models
5.1. Empirically Derived Petrophysical Model for the Trend
5.2. Neural Wavelets for Scatter Distribution
5.3. Joint Inversion Strategy
6. Example
7. Discussions and Conclusions
Part III Non-Seismic Applications
Chapter 13 Non-Seismic Applications
1. Introduction
2. Well Logging
2.1. Porosity and Permeability Estimation
2.2. Lithofacies Mapping
3. Gravity and Magnetics
4. Electromagnetics
4.1. Frequency-Domain
4.2. Time-Domain
4.3. Magnetotelluric
4.4. Ground Penetrating Radar
5. Resistivity
6. Multi-Sensor Data
Chapter 14 Detection of AEM Anomalies Corresponding to
Dike Structures
1. Introduction
2. Airborne Electromagnetic Method: Theoretical Background
2.1. General
2.2. Forward Modeling for 1-Dimensional Models
2.3. Forward Modeling for 2-Dimensional Models with EMIGMA
3. Feedforward Computational Neural Networks (CNN)
4. Concept
5. CNNs to Calculate Homogeneous Halfspaces
6. CNN for Detecting 2D Structures
6.1. Training and Test Vectors
6.2. Calculation of the Error Term (+1 ppm, +2 ppm)
6.3. Calculation of the Random Models (Model Categories 6-8)
6.4. Training
7. Testing
8. Conclusion
Chapter 15 Locating Layer Boundaries with Unfocused
Resistivity Tools
1. Introduction
2. Layer Boundary Picking
3. Modular Neural Network
4. Training With Multiple Logging Tools
4.1. MNN, MLP, and RBF Architectures
4.2. Rprop and GRNN Architectures
5. Analysis of Results
5.1. Thin Layer Model (Thickness from 0.5 to 2 m)
5.2. Medium-Thickness Layer Model (Thickness from 1.5 to 4 m)
5.3. Thick Layer Model (Thickness from 6 to 16 m)
5.4. Testing the Sensitivity to Resistivity
6. Conclusions
Chapter 16 A Neural Network Interpretation System for Near-Surface
Geophysics Electromagnetic Ellipticity Soundings
1. Introduction
2. Function Approximation
2.1. Background
2.2. Radial Basis Function Neural Network
3. Neural Network Training
4. Case History
4.1. Piecewise Half-Space Interpretation
4.2. Half-Space Interpretations
5. Conclusion
Chapter 17 Extracting IP Parameters From TEM Data
1. Introduction
2. Forward Modeling
3. Inverse Modeling With Neural Networks
4. Testing Results
4.1. Half-Space
4.2. Layered Ground
4.3. Polarizable First Layer
4.4. Polarizable Second Layer
5. Uncertainty Evaluation
6. Sensitivity Evaluation
7. Case Study
8. Conclusions
Author Index
Index

Other Neural Network Books
Download
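Chapter 10's use of a self-organizing map for classifying seismic traces rests on a simple loop: find the best-matching unit, pull it and its neighbours toward the input, and shrink the neighbourhood over time. A minimal 1-D sketch with invented "trace" vectors (not the chapter's own data, map size, or schedule):

```python
import random

def som_train(data, n_units=4, epochs=40, rate=0.3, seed=0):
    """Train a 1-D Kohonen map: each unit holds a prototype vector."""
    rng = random.Random(seed)
    units = [[rng.random() for _ in data[0]] for _ in range(n_units)]
    for epoch in range(epochs):
        radius = 1 if epoch < epochs // 2 else 0   # shrink neighbourhood
        for x in data:
            # Best-matching unit = smallest squared distance to the input.
            bmu = min(range(n_units),
                      key=lambda j: sum((a - b) ** 2
                                        for a, b in zip(units[j], x)))
            # Move the BMU and its neighbours toward the input.
            for j in range(max(0, bmu - radius),
                           min(n_units, bmu + radius + 1)):
                units[j] = [w + rate * (v - w) for w, v in zip(units[j], x)]
    return units

def classify(units, x):
    """Label an input by the index of its best-matching unit."""
    return min(range(len(units)),
               key=lambda j: sum((a - b) ** 2 for a, b in zip(units[j], x)))

# Two synthetic "trace" clusters: low-amplitude vs. high-amplitude.
traces = [[0.1, 0.0, 0.1], [0.0, 0.1, 0.0],
          [0.9, 1.0, 0.9], [1.0, 0.9, 1.0]]
units = som_train(traces)
```

After training, inputs from the two amplitude regimes land on different units, which is exactly the unsupervised trace-grouping behaviour the chapter exploits.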

A Guide to Neural Computing Applications







Contents
Part I Introduction
1 Computational Intelligence: An Introduction................................................ 3
1.1 Introduction .............................................................................................. 3
1.2 Soft Computing......................................................................................... 3
1.3 Probabilistic Reasoning ............................................................................ 4
1.4 Evolutionary Computation........................................................................ 6
1.5 Computational Intelligence....................................................................... 8
1.6 Hybrid Computational Technology .......................................................... 9
1.7 Application Areas ................................................................................... 10
1.8 Applications in Industry ......................................................................... 11
References .............................................................................................. 12
2 Traditional Problem Definition ..................................................................... 17
2.1 Introduction to Time Series Analysis ..................................................... 17
2.2 Traditional Problem Definition............................................................... 18
2.2.1 Characteristic Features .............................................................. 18
2.2.1.1 Stationarity .................................................................. 18
2.2.1.2 Linearity ...................................................................... 20
2.2.1.3 Trend............................................................................ 20
2.2.1.4 Seasonality................................................................... 21
2.2.1.5 Estimation and Elimination of Trend and
Seasonality................................................................... 21
2.3 Classification of Time Series.................................................................. 22
2.3.1 Linear Time Series .................................................................... 23
2.3.2 Nonlinear Time Series............................................................... 23
2.3.3 Univariate Time Series.............................................................. 23
2.3.4 Multivariate Time Series........................................................... 24
2.3.5 Chaotic Time Series .................................................................. 24
2.4 Time Series Analysis .............................................................................. 25
2.4.1 Objectives of Analysis .............................................................. 25
2.4.2 Time Series Modelling.............................................................. 26
2.4.3 Time Series Models................................................................... 26
2.5 Regressive Models.................................................................................. 27
2.5.1 Autoregression Model .............................................................. 27
2.5.2 Moving-average Model ............................................................ 28
2.5.3 ARMA Model ........................................................................... 28
2.5.4 ARIMA Model.......................................................................... 29
2.5.5 CARMAX Model...................................................................... 32
2.5.6 Multivariate Time Series Model................................................ 33
2.5.7 Linear Time Series Models ....................................................... 35
2.5.8 Nonlinear Time Series Models.................................................. 35
2.5.9 Chaotic Time Series Models ..................................................... 36
2.6 Time-domain Models.............................................................................. 37
2.6.1 Transfer-function Models.......................................................... 37
2.6.2 State-space Models.................................................................... 38
2.7 Frequency-domain Models ..................................................................... 39
2.8 Model Building....................................................................................... 42
2.8.1 Model Identification.................................................................. 43
2.8.2 Model Estimation ...................................................................... 45
2.8.3 Model Validation and Diagnostic Check .................................. 48
2.9 Forecasting Methods............................................................................... 49
2.9.1 Some Forecasting Issues ........................................................... 50
2.9.2 Forecasting Using Trend Analysis ............................................ 51
2.9.3 Forecasting Using Regression Approaches ............................... 51
2.9.4 Forecasting Using the Box-Jenkins Method.............................. 53
2.9.4.1 Forecasting Using an Autoregressive Model AR(p).... 53
2.9.4.2 Forecasting Using a Moving-average Model MA(q)... 54
2.9.4.3 Forecasting Using an ARMA Model ........................... 54
2.9.4.4 Forecasting Using an ARIMA Model.......................... 56
2.9.4.5 Forecasting Using a CARIMAX Model .................... 57
2.9.5 Forecasting Using Smoothing ................................................... 57
2.9.5.1 Forecasting Using a Simple Moving Average ............. 57
2.9.5.2 Forecasting Using Exponential Smoothing ................. 58
2.9.5.3 Forecasting Using Adaptive Smoothing ...................... 62
2.9.5.4 Combined Forecast ...................................................... 64
2.10 Application Examples............................................................................. 66
2.10.1 Forecasting Nonstationary Processes ........................................ 66
2.10.2 Quality Prediction of Crude Oil ................................................ 67
2.10.3 Production Monitoring and Failure Diagnosis .......................... 68
2.10.4 Tool Wear Monitoring .............................................................. 68
2.10.5 Minimum Variance Control ...................................................... 69
2.10.6 General Predictive Control........................................................ 71
References .............................................................................................. 74
Selected Reading .................................................................................... 74
Part II Basic Intelligent Computational Technologies
3 Neural Networks Approach ........................................................................... 79
3.1 Introduction ............................................................................................ 79
3.2 Basic Network Architecture.................................................................... 80
3.3 Networks Used for Forecasting .............................................................. 84
3.3.1 Multilayer Perceptron Networks ............................................... 84
3.3.2 Radial Basis Function Networks ............................................... 85
3.3.3 Recurrent Networks .................................................................. 87
3.3.4 Counter Propagation Networks ................................................. 92
3.3.5 Probabilistic Neural Networks .................................................. 94
3.4 Network Training Methods..................................................................... 95
3.4.1 Accelerated Backpropagation Algorithm.................................. 99
3.5 Forecasting Methodology ..................................................................... 103
3.5.1 Data Preparation for Forecasting............................................. 104
3.5.2 Determination of Network Architecture.................................. 106
3.5.3 Network Training Strategy...................................................... 112
3.5.4 Training, Stopping and Evaluation.......................................... 116
3.6 Forecasting Using Neural Networks..................................................... 129
3.6.1 Neural Networks versus Traditional Forecasting .................... 129
3.6.2 Combining Neural Networks and Traditional Approaches ..... 131
3.6.3 Nonlinear Combination of Forecasts Using Neural Networks 132
3.6.4 Forecasting of Multivariate Time Series ................................. 136
References ............................................................................................ 137
Selected Reading .................................................................................. 142
4 Fuzzy Logic Approach ................................................................................. 143
4.1 Introduction .......................................................................................... 143
4.2 Fuzzy Sets and Membership Functions ................................................ 144
4.3 Fuzzy Logic Systems ........................................................................... 146
4.3.1 Mamdani Type of Fuzzy Logic Systems................................. 148
4.3.2 Takagi-Sugeno Type of Fuzzy Logic Systems........................ 148
4.3.3 Relational Fuzzy Logic System of Pedrycz............................. 149
4.4 Inferencing the Fuzzy Logic System .................................................... 150
4.4.1 Inferencing a Mamdani-type Fuzzy Model ............................. 150
4.4.2 Inferencing a Takagi-Sugeno-type Fuzzy Model .................... 153
4.4.3 Inferencing a (Pedrycz) Relational Fuzzy Model.................... 154
4.5 Automated Generation of Fuzzy Rule Base.......................................... 157
4.5.1 The Rules Generation Algorithm ............................................ 157
4.5.2 Modifications Proposed for Automated Rules Generation...... 162
4.5.3 Estimation of Takagi-Sugeno Rules’ Consequent
Parameters............................................................................... 166
4.6 Forecasting Time Series Using the Fuzzy Logic Approach.................. 169
4.6.1 Forecasting Chaotic Time Series: An Example....................... 169
4.7 Rules Generation by Clustering............................................................ 173
4.7.1 Fuzzy Clustering Algorithms for Rule Generation.................. 173
4.7.1.1 Elements of Clustering Theory ................................. 174
4.7.1.2 Hard Partition ............................................................ 175
4.7.1.3 Fuzzy Partition........................................................... 177
4.7.2 Fuzzy c-means Clustering ....................................................... 178
4.7.2.1 Fuzzy c-means Algorithm.......................................... 179
4.7.2.1.1 Parameters of Fuzzy c-means Algorithm.... 180
4.7.3 Gustafson-Kessel Algorithm ................................................... 183
4.7.3.1 Gustafson-Kessel Clustering Algorithm.................... 184
4.7.3.1.1 Parameters of Gustafson-Kessel
Algorithm.................................................... 185
4.7.3.1.2 Interpretation of Cluster Covariance
Matrix ......................................................... 185
4.7.4 Identification of Antecedent Parameters by Fuzzy
Clustering ................................................................................ 185
4.7.5 Modelling of a Nonlinear Plant............................................... 187
4.8 Fuzzy Model as Nonlinear Forecasts Combiner................................... 190
4.9 Concluding Remarks ............................................................................ 193
References ............................................................................................ 193
5 Evolutionary Computation .......................................................................... 195
5.1 Introduction .......................................................................................... 195
5.1.1 The Mechanisms of Evolution ................................................ 196
5.1.2 Evolutionary Algorithms......................................................... 196
5.2 Genetic Algorithms............................................................................... 197
5.2.1 Genetic Operators.................................................................... 198
5.2.1.1 Selection .................................................................... 199
5.2.1.2 Reproduction ............................................................. 199
5.2.1.3 Mutation .................................................................... 199
5.2.1.4 Crossover................................................................... 201
5.2.2 Auxiliary Genetic Operators ................................................... 201
5.2.2.1 Fitness Windowing or Scaling................................... 201
5.2.3 Real-coded Genetic Algorithms .............................................. 203
5.2.3.1 Real Genetic Operators.............................................. 204
5.2.3.1.1 Selection Function ...................................... 204
5.2.3.1.2 Crossover Operators for Real-coded Genetic Algorithms ..... 205
5.2.3.1.3 Mutation Operators..................................... 205
5.2.4 Forecasting Examples ............................................................. 206
5.3 Genetic Programming........................................................................... 209
5.3.1 Initialization ............................................................................ 210
5.3.2 Execution of Algorithm........................................................... 211
5.3.3 Fitness Measure....................................................................... 211
5.3.4 Improved Genetic Versions..................................................... 211
5.3.5 Applications ............................................................................ 212
5.4 Evolutionary Strategies......................................................................... 212
5.4.1 Applications to Real-world Problems .................................... 213
5.5 Evolutionary Programming .................................................................. 214
5.5.1 Evolutionary Programming Mechanism ................................ 215
5.6 Differential Evolution .......................................................................... 215
5.6.1 First Variant of Differential Evolution (DE1) ......................... 216
5.6.2 Second Variant of Differential Evolution (DE2)..................... 218
References ............................................................................................ 218
Part III Hybrid Computational Technologies
6 Neuro-fuzzy Approach ................................................................................. 223
6.1 Motivation for Technology Merging .................................................... 223
6.2 Neuro-fuzzy Modelling ........................................................................ 224
6.2.1 Fuzzy Neurons ........................................................................ 227
6.2.1.1 AND Fuzzy Neuron................................................... 228
6.2.1.2 OR Fuzzy Neuron...................................................... 229
6.3 Neuro-fuzzy System Selection for Forecasting .................................... 230
6.4 Takagi-Sugeno-type Neuro-fuzzy Network.......................................... 232
6.4.1 Neural Network Representation of Fuzzy Logic Systems....... 233
6.4.2 Training Algorithm for Neuro-fuzzy Network........................ 234
6.4.2.1 Backpropagation Training of Takagi-Sugeno-type Neuro-fuzzy Network ............................ 234
6.4.2.2 Improved Backpropagation Training Algorithm ....... 238
6.4.2.3 Levenberg-Marquardt Training Algorithm................ 239
6.4.2.3.1 Computation of Jacobian Matrix ............... 241
6.4.2.4 Adaptive Learning Rate and Oscillation Control ...... 246
6.5 Comparison of Radial Basis Function Network and Neuro-fuzzy Network .......................................... 247
6.6 Comparison of Neural Network and Neuro-fuzzy Network Training .. 248
6.7 Modelling and Identification of Nonlinear Dynamics ......................... 249
6.7.1 Short-term Forecasting of Electrical load ............................... 249
6.7.2 Prediction of Chaotic Time Series........................................... 253
6.7.3 Modelling and Prediction of Wang Data................................. 258
6.8 Other Engineering Application Examples ............................................ 264
6.8.1 Application of Neuro-fuzzy Modelling to Materials Property Prediction ................................. 265
6.8.1.1 Property Prediction for C-Mn Steels .......................... 266
6.8.1.2 Property Prediction for C-Mn-Nb Steels .................... 266
6.8.2 Correction of Pyrometer Reading ........................................... 266
6.8.3 Application for Tool Wear Monitoring .................................. 268
6.9 Concluding Remarks ............................................................................ 270
References ............................................................................................ 271
7 Transparent Fuzzy/Neuro-fuzzy Modelling .............................................. 275
7.1 Introduction ......................................................................................... 275
7.2 Model Transparency and Compactness ................................................ 276
7.3 Fuzzy Modelling with Enhanced Transparency.................................... 277
7.3.1 Redundancy in Numerical Data-driven Modelling ................. 277
7.3.2 Compact and Transparent Modelling Scheme ........................ 279
7.4 Similarity Between Fuzzy Sets ............................................................. 281
7.4.1 Similarity Measure .................................................................. 282
7.4.2 Similarity-based Rule Base Simplification ............................. 282
7.5 Simplification of Rule Base.................................................................. 285
7.5.1 Merging Similar Fuzzy Sets.................................................... 287
7.5.2 Removing Irrelevant Fuzzy Sets ............................................. 289
7.5.3 Removing Redundant Inputs................................................... 290
7.5.4 Merging Rules ........................................................................ 290
7.6 Rule Base Simplification Algorithms .................................................. 291
7.6.1 Iterative Merging..................................................................... 292
7.6.2 Similarity Relations................................................................. 294
7.7 Model Competitive Issues: Accuracy versus Complexity .................... 296
7.8 Application Examples........................................................................... 299
7.9 Concluding Remarks ............................................................................ 302
References ............................................................................................ 302
8 Evolving Neural and Fuzzy Systems ........................................................... 305
8.1 Introduction .......................................................................................... 305
8.1.1 Evolving Neural Networks...................................................... 305
8.1.1.1 Evolving Connection Weights................................... 306
8.1.1.2 Evolving the Network Architecture........................... 309
8.1.1.3 Evolving the Pure Network Architecture................... 310
8.1.1.4 Evolving Complete Network ..................................... 311
8.1.1.5 Evolving the Activation Function.............................. 312
8.1.1.6 Application Examples................................................ 313
8.1.2 Evolving Fuzzy Logic Systems............................................... 313
References ............................................................................................ 317
9 Adaptive Genetic Algorithms....................................................................... 321
9.1 Introduction .......................................................................................... 321
9.2 Genetic Algorithm Parameters to Be Adapted...................................... 322
9.3 Probabilistic Control of Genetic Algorithm Parameters ....................... 323
9.4 Adaptation of Population Size .............................................................. 327
9.5 Fuzzy-logic-controlled Genetic Algorithms ......................................... 329
9.6 Concluding Remarks ............................................................................ 330
References ............................................................................................ 330
Part IV Recent Developments
10 State of the Art and Development Trends .................................................. 335
10.1 Introduction .......................................................................................... 335
10.2 Support Vector Machines ..................................................................... 337
10.2.1 Data-dependent Representation............................................... 342
10.2.2 Machine Implementation......................................................... 343
10.2.3 Applications ............................................................................ 344
10.3 Wavelet Networks ................................................................................ 345
10.3.1 Wavelet Theory....................................................................... 345
10.3.2 Wavelet Neural Networks ....................................................... 346
10.3.3 Applications ............................................................................ 349
10.4 Fractally Configured Neural Networks................................................. 350
10.5 Fuzzy Clustering................................................................................... 352
10.5.1 Fuzzy Clustering Using Kohonen Networks........................... 353
10.5.2 Entropy-based Fuzzy Clustering ............................................. 355
10.5.2.1 Entropy Measure for Cluster Estimation ................... 356
10.5.2.2 Fuzzy Clustering Based on Entropy Measure............ 358
10.5.2.3 Fuzzy Model Identification Using Entropy-based Fuzzy Clustering ............................... 359
References ............................................................................................ 360
Index .................................................................................................................... 363
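The fuzzy c-means algorithm covered in Section 4.7.2 (and revisited in Chapter 10) alternates two updates: recompute cluster centers from the fuzzified memberships, then recompute memberships from distances to the centers. A minimal NumPy sketch of that loop — not the book's code; the fuzzifier m = 2, the tolerance, and the random initialization are illustrative choices:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and membership matrix U (c x n)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((c, n))
    U /= U.sum(axis=0)                      # memberships of each point sum to 1
    for _ in range(max_iter):
        Um = U ** m
        # centers are membership-weighted means of the data
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        # distance from every point to every center (small offset avoids division by zero)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-10
        # standard FCM membership update, then renormalize columns
        U_new = 1.0 / d ** (2.0 / (m - 1.0))
        U_new /= U_new.sum(axis=0)
        if np.linalg.norm(U_new - U) < tol:
            return centers, U_new
        U = U_new
    return centers, U
```

On two well-separated blobs the centers land near the blob means and the memberships become nearly crisp, which is the behavior the hard-vs-fuzzy partition discussion in Section 4.7.1 contrasts.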

Another Neural Network Books
Download

Monday, October 10, 2011

Nature-inspired Methods in Chemometrics Genetic Algorithms and Artificial Neural Networks






CONTENTS
PREFACE vii
LIST OF CONTRIBUTORS xvii
PART I: GENETIC ALGORITHMS 1
CHAPTER 1 3
GENETIC ALGORITHMS AND BEYOND
(Brian T. Luke)
1 Introduction 3
2 Biological systems and the simple genetic algorithm (SGA) 5
3 Why do GAs work? 6
4 Creating a genetic algorithm 7
4.1 Determining a fitness function 7
4.2 The genetic vector 8
4.3 Creating an initial population 13
4.4 Selection schemes 14
4.5 Mating operators 16
4.6 Mutation operators 23
4.7 Maturation operators 25
4.8 Processing offspring 26
4.9 Termination metrics 27
5 Exploration versus exploitation 28
5.1 The genetic vector 29
5.2 The initial population 30
5.3 Selection schemes 31
5.4 Mating operators 33
5.5 Mutation operators 34
5.6 Maturation operators 34
5.7 Processing offspring 34
5.8 Balancing exploration and exploitation 36
6 Other population-based methods 40
6.1 Parallel GA 41
6.2 Adaptive parallel GA 41
6.3 Meta-GA 42
6.4 Messy GA 42
6.5 Delta coding GA 43
6.6 Tabu search and Gibbs sampling 43
6.7 Evolutionary programming 44
6.8 Evolution strategies 44
6.9 Ant colony optimization 45
6.10 Particle swarm optimization 46
7 Conclusions 48
CHAPTER 2 55
HYBRID GENETIC ALGORITHMS
(D. Brynn Hibbert)
1 Introduction 55
2 The approach to hybridization 55
2.1 Levels of interaction 56
2.2 A simple classification 57
3 Why hybridize? 57
4 Detailed examples 59
4.1 Genetic algorithm with local optimizer 59
4.2 Genetic algorithm–artificial neural network hybrid optimizing quantitative structure–activity relationships 62
4.3 Non-linear partial least squares regression with optimization of the inner relation function by a genetic algorithm 63
4.4 The use of a clustering algorithm in a genetic algorithm 64
5 Conclusion 66
CHAPTER 3 69
ROBUST SOFT SENSOR DEVELOPMENT USING
GENETIC PROGRAMMING
(Arthur K. Kordon, Guido F. Smits, Alex N. Kalos,
and Elsa M. Jordaan)
1 Introduction 69
2 Soft sensors in industry 71
2.1 Assumptions for soft sensors development 72
2.2 Economic benefits from soft sensors 73
2.3 Soft sensor application areas 74
2.4 Soft sensor vendors 75
3 Requirements for robust soft sensors 76
3.1 Lessons from industrial applications 76
3.2 Design requirements for robust soft sensors 77
4 Selected approaches for effective soft sensors development 80
4.1 Stacked analytical neural networks 80
4.2 Support vector machines 85
5 Genetic programming in soft sensors development 90
5.1 The nature of genetic programming 90
5.2 Solving problems with genetic programming 96
5.3 Advantages of genetic programming in soft
sensors development and implementation 98
6 Integrated methodology 99
6.1 Variable selection by analytical neural networks 100
6.2 Data condensation by support vector machines 101
6.3 Inferential model generation by genetic programming 102
6.4 On-line implementation and model self-assessment 102
7 Soft sensor for emission estimation: a case study 103
8 Conclusions 105
CHAPTER 4 109
GENETIC ALGORITHMS IN MOLECULAR MODELLING:
A REVIEW
(Alessandro Maiocchi)
1 Introduction 109
2 Molecular modelling and genetic algorithms 110
2.1 How to represent molecular structures and their conformations 111
3 Small and medium-sized molecule conformational search 114
4 Constrained conformational space searches 119
4.1 NMR-derived distance constraints 120
4.2 Pharmacophore-derived constraints 121
4.3 Constrained conformational search by chemical feature superposition 122
5 The protein-ligand docking problem 124
5.1 The scoring functions 126
5.2 Protein–ligand docking with genetic algorithms 127
6 Protein structure prediction with genetic algorithms 131
7 Conclusions 134
CHAPTER 5 141
MOBYDIGS: SOFTWARE FOR REGRESSION AND
CLASSIFICATION MODELS BY GENETIC ALGORITHMS
(Roberto Todeschini, Viviana Consonni, Andrea Mauri
and Manuela Pavan)
1 Introduction 141
2 Population definition 143
3 Tabu list 143
4 Random variables 144
5 Parent selection 145
6 Crossover/mutation trade-off 145
7 Selection pressure and crossover/mutation trade-off influence 148
8 RQK fitness functions 151
9 Evolution of the populations 154
10 Model distance 155
11 The software MobyDigs 158
11.1 The data setup 158
11.2 GA setup 159
11.3 Population evolution view 161
11.4 Modify a single population evolution 162
11.5 Modify multiple population evolution 163
11.6 Analysis of the final models 164
11.7 Variable frequency analysis 165
11.8 Saving results 166
CHAPTER 6 169
GENETIC ALGORITHM-PLS AS A TOOL FOR
WAVELENGTH SELECTION IN SPECTRAL DATA SETS
(Riccardo Leardi)
1 Introduction 169
2 The problem of variable selection 170
3 GA applied to variable selection 172
3.1 Initiation of population 172
3.2 Reproduction and mutation 173
3.3 Insertion of new chromosomes 173
3.4 Control of replicates 174
3.5 Influence of the different parameters 174
3.6 Check of subsets 175
3.7 Hybridisation with stepwise selection 176
4 Evolution of the genetic algorithm 176
4.1 The application of randomisation tests 176
4.2 The optimisation of a GA run 177
4.3 Why a single run is not enough 177
4.4 How to take into account the autocorrelation among the spectral variables 178
5 Pretreatment and scaling 181
6 Maximum number of variables 182
7 Examples 183
7.1 Data set Soy 183
7.2 Data set Additives 190
8 Conclusions 194
PART II: ARTIFICIAL NEURAL NETWORKS 197
CHAPTER 7 199
BASICS OF ARTIFICIAL NEURAL NETWORKS
(Jure Zupan)
1 Introduction 199
2 Basic concepts 200
2.1 Neuron 200
2.2 Network of neurons 202
3 Error backpropagation ANNs 204
4 Kohonen ANNs 206
4.1 Basic design 206
4.2 Self-organized maps (SOMs) 210
5 Counterpropagation ANNs 213
6 Radial basis function (RBF) networks 216
7 Learning by ANNs 220
8 Applications 223
8.1 Classification 223
8.2 Mapping 224
8.3 Modeling 225
9 Conclusions 226
CHAPTER 8 231
ARTIFICIAL NEURAL NETWORKS IN MOLECULAR
STRUCTURES—PROPERTY STUDIES
(Marjana Novic and Marjan Vracko)
1 Introduction 231
2 Molecular descriptors 231
3 Counter propagation neural network 233
3.1 Architecture of a counter propagation neural network 233
3.2 Learning in the Kohonen and output layers 235
3.3 Counter propagation neural network as a tool in QSAR 236
4 Application in toxicology and drug design 237
4.1 A study of aquatic toxicity for the fathead minnow 237
4.2 A study of aquatic toxicity toward Tetrahymena pyriformis
on a set of 225 phenols 239
4.3 Example of QSAR modeling with receptor dependent descriptors 242
5 Conclusions 252
CHAPTER 9 257
NEURAL NETWORKS FOR THE CALIBRATION
OF VOLTAMMETRIC DATA
(Conrad Bessant and Edward Richards)
1 Introduction 257
2 Electroanalytical data 257
2.1 Amperometry 258
2.2 Pulsed amperometric detection 259
2.3 Voltammetry 259
2.4 Dual pulse staircase voltammetry 259
2.5 Representation of voltammetric data 261
3 Application of artificial neural networks to voltammetric data 261
3.1 Basic approach 262
3.2 Example of ANN calibration of voltammograms 263
3.3 Summary and conclusions 269
4 Genetic algorithms for optimisation of feed forward neural networks 269
4.1 Genes and chromosomes 269
4.2 Choosing parents for the next generation 270
4.3 Results of ANN optimisation by GA 272
4.4 Comparison of optimisation methods 277
5 Conclusions 278
CHAPTER 10 281
NEURAL NETWORKS AND GENETIC ALGORITHMS
APPLICATIONS IN NUCLEAR MAGNETIC
RESONANCE (NMR) SPECTROSCOPY
(Reinhard Meusinger and Uwe Himmelreich)
1 Introduction 281
2 NMR spectroscopy 283
3 Neural networks applications 285
3.1 Classification 286
3.2 Prediction of properties 290
4 Genetic algorithms 303
4.1 Data processing 304
4.2 Structure determination 305
4.3 Structure prediction 308
4.4 Classification 308
4.5 Feature reduction 309
5 Biomedical NMR spectroscopy 309
6 Conclusion 315
CHAPTER 11 323
A QSAR MODEL FOR PREDICTING THE ACUTE
TOXICITY OF PESTICIDES TO GAMMARIDS
(James Devillers)
1 Introduction 323
2 Materials and methods 324
2.1 Toxicity data 324
2.2 Molecular descriptors 324
2.3 Statistical analyses 329
3 Results and discussion 330
3.1 PLS model 330
3.2 ANN model 332
4 Conclusions 338
CONCLUSION 341
CHAPTER 12 343
APPLYING GENETIC ALGORITHMS AND NEURAL
NETWORKS TO CHEMOMETRIC PROBLEMS
(Brian T. Luke)
1 Introduction 343
2 Structure of the genetic algorithm 345
3 Results for the genetic algorithms 350
4 Structure of the neural network 362
5 Results for the neural network 365
6 Conclusions 373
INDEX 377
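Chapter 1's simple genetic algorithm (SGA) — build a population of genetic vectors, then iterate selection, mating, and mutation — can be sketched compactly. This is a hedged illustration, not Luke's code: the tournament selection, one-point crossover, bit-flip mutation, and the OneMax demo fitness (count of 1-bits) are standard stand-ins for the operator families the chapter surveys:

```python
import random

def simple_ga(fitness, n_bits=20, pop_size=30, generations=60,
              p_cross=0.8, p_mut=0.01, seed=42):
    """Bare-bones SGA over bit strings; returns the fittest individual found."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick()[:], pick()[:]
            if rng.random() < p_cross:
                cut = rng.randrange(1, n_bits)   # one-point crossover
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                for i in range(n_bits):
                    if rng.random() < p_mut:     # bit-flip mutation
                        child[i] ^= 1
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

best = simple_ga(sum)   # OneMax: fitness is the number of 1-bits
```

Swapping the selection scheme, mating operator, or mutation rate in this skeleton is exactly the exploration-versus-exploitation tuning discussed in Section 5 of that chapter.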

Another Neural Network Books
Another Artificial Intelligence Books
Another Genetic Algorithm Books
Download

Sunday, September 18, 2011

Fusion of Neural Networks, Fuzzy Systems and Genetic Algorithms: Industrial Applications






Preface
Chapter 1—Introduction to Neural Networks, Fuzzy Systems, Genetic
Algorithms, and their Fusion
1. Knowledge-Based Information Systems
2. Artificial Neural Networks
3. Evolutionary Computing
4. Fuzzy Logic
5. Fusion
6. Summary
References
Chapter 2—A New Fuzzy-Neural Controller
1. Introduction
2. RBF Based Fuzzy System with Unsupervised Learning
2.1 Fuzzy System Based on RBF
2.2 Coding
2.3 Selection
2.4 Crossover Operator
2.5 Mutation Operator
3. Hierarchical Fuzzy-Neuro Controller Based on Skill Knowledge Database
4. Fuzzy-Neuro Controller for Cart-Pole System
5. Conclusions
References
Chapter 3—Expert Knowledge-Based Direct Frequency Converter Using
Fuzzy Logic Control
1. Introduction
2. XDFC Topology and Operation
3. Space Vector Model of the DFC
4. Expert Knowledge-Based SVM
5. XDFC Control
5.1 XDFC Control Strategy and Operation
5.2 Fuzzy Logic Controller
5.3 Load’s Line Current Control
5.4 Input’s Line Current Control
6. Results
7. Evaluation
8. Conclusion
References
Chapter 4—Design of an Electro-Hydraulic System Using Neuro-Fuzzy
Techniques
1. Introduction
2. The Fuzzy Logic System
2.1 Fuzzification
2.2 Inference Mechanism
2.3 Defuzzification
3. Fuzzy Modeling
4. The Learning Mechanism
4.1 Model Initialization
4.2 The Cluster-Based Algorithm
4.3 Illustrative Example
4.4 The Neuro-Fuzzy Algorithm
5. The Experimental System
5.1 Training Data Generation
6. Neuro-Fuzzy Modeling of the Electro-Hydraulic Actuator
7. The Neuro-Fuzzy Control System
7.1 Experimental Results
8. Conclusion
References
Chapter 5—Neural Fuzzy Based Intelligent Systems and Applications
1. Introduction
2. Advantages and Disadvantages of Fuzzy Logic and Neural Nets
2.1 Advantages of Fuzzy Logic
2.2 Disadvantages of Fuzzy Logic
2.3 Advantages of Neural Nets
2.4 Disadvantages of Neural Nets
3. Capabilities of Neural Fuzzy Systems (NFS)
4. Types of Neural Fuzzy Systems
5. Descriptions of a Few Neural Fuzzy Systems
5.1 NeuFuz
5.1.1 Brief Overview
5.1.2 NeuFuz Architecture
5.1.3 Fuzzy Logic Processing
5.2 Recurrent Neural Fuzzy System (RNFS)
5.2.1 Recurrent Neural Net
5.2.2 Temporal Information and Weight Update
5.2.3 Recurrent Fuzzy Logic
5.2.4 Determining the Number of Time Delays
6. Representative Applications
6.1 Motor Control
6.1.1 Choosing the Inputs and Outputs
6.1.2 Data Collection and Training
6.1.3 Rule Evaluation and Optimization
6.1.4 Results and Comparison with the PID Approach
6.2 Toaster Control
6.3 Speech Recognition using RNFS
6.3.1 Small Vocabulary Word Recognition
6.3.2 Training and Testing
7. Conclusion
References
Chapter 6—Vehicle Routing through Simulation of Natural Processes
1. Introduction
2. Vehicle Routing Problems
3. Neural Networks
3.1 Self-Organizing Maps
3.1.1 Vehicle Routing Applications
3.1.2 The Hierarchical Deformable Net
3.2 Feedforward Models
3.2.1 Dynamic vehicle routing and dispatching
3.2.2 Feedforward Neural Network Model with Backpropagation
3.2.3 An Application for a Courier Service
4. Genetic Algorithms
4.1 Genetic clustering
4.1.1 Genetic Sectoring (GenSect)
4.1.2 Genetic Clustering with Geometric Shapes (GenClust)
4.1.3 Real-World Applications
4.2 Decoders
4.3 A Nonstandard GA
5. Conclusion
Acknowledgments
References
Chapter 7—Fuzzy Logic and Neural Networks in Fault Detection
1. Introduction
2. Fault Diagnosis
2.1 Concept of Fault Diagnosis
2.2 Different Approaches for Residual Generation and Residual Evaluation
3. Fuzzy Logic in Fault Detection
3.1 A Fuzzy Filter for Residual Evaluation
3.1.1 Structure of the Fuzzy Filter
3.1.2 Supporting Algorithm for the Design of the Fuzzy Filter
3.2 Application of the Fuzzy Filter to a Wastewater Plant
3.2.1 Description of the Process
3.2.2 Design of the Fuzzy Filter for Residual Evaluation
3.2.3 Simulation Results
4. Neural Networks in Fault Detection
4.1 Neural Networks for Residual Generation
4.1.1 Radial-Basis-Function(RBF) Neural Networks
4.1.2 Recurrent Neural Networks (RNN)
4.2 Neural Networks for Residual Evaluation
4.2.1 Restricted-Coulomb-Energy (RCE) Neural Networks
4.3 Application to the Industrial Actuator Benchmark Test
4.3.1 Simulation Results for Residual Generation
4.3.2 Simulation Results for Residual Evaluation
5. Conclusions
References
Chapter 8—Application of the Neural Network and Fuzzy Logic to the
Rotating Machine Diagnosis
1. Introduction
2. Rotating Machine Diagnosis
2.1 Fault Diagnosis Technique for Rotating Machines
3. Application of Neural Networks and Fuzzy Logic for Rotating Machine Diagnosis
3.1 Fault Diagnosis Using a Neural Network
3.2 Fault Diagnosis Using Fuzzy Logic
4. Conclusion
References
Chapter 9—Fuzzy Expert Systems in ATM Networks
1. Introduction

Another Neural Network Books
Another Fuzzy Logic Books
Another Genetic Algorithm Books
Download

Friday, April 15, 2011

Algorithms and Architectures






Contents
Contributors xv
Preface xix
Statistical Theories of Learning in Radial Basis
Function Networks
Jason A. S. Freeman, Mark J. L. Orr, and David Saad
I. Introduction 1
A. Radial Basis Function Network 2
II. Learning in Radial Basis Function Networks 4
A. Supervised Learning 4
B. Linear Models 5
C. Bias and Variance 9
D. Cross-Validation 11
E. Ridge Regression 13
F. Forward Selection 17
G. Conclusion 19
III. Theoretical Evaluations of Network Performance 21
A. Bayesian and Statistical Mechanics Approaches 21
B. Probably Approximately Correct Framework 31
C. Approximation Error/Estimation Error 37
D. Conclusion 39
IV. Fully Adaptive Training—An Exact Analysis 40
A. On-Line Learning in Radial Basis
Function Networks 41
B. Generalization Error and System Dynamics 42
C. Numerical Solutions 43
D. Phenomenological Observations 45
E. Symmetric Phase 47
F. Convergence Phase 49
G. Quantifying the Variances 50
H. Simulations 52
I. Conclusion 52
V. Summary 54
Appendix 55
References 57
Synthesis of Three-Layer Threshold Networks
Jung Hwan Kim, Sung-Kwon Park, Hyunseo Oh, and Youngnam Han
I. Introduction 62
II. Preliminaries 63
III. Finding the Hidden Layer 64
IV. Learning an Output Layer 73
V. Examples 77
A. Approximation of a Circular Region 77
B. Parity Function 80
C. 7-Bit Function 83
VI. Discussion 84
VII. Conclusion 85
References 86
Weight Initialization Techniques
Mikko Lehtokangas, Petri Salmela, Jukka Saarinen, and Kimmo Kaski
I. Introduction 87
II. Feedforward Neural Network Models 89
A. Multilayer Perceptron Networks 89
B. Radial Basis Function Networks 90
III. Stepwise Regression for Weight Initialization 90
IV. Initialization of Multilayer Perceptron Networks 92
A. Orthogonal Least Squares Method 92
B. Maximum Covariance Method 93
C. Benchmark Experiments 93
V. Initial Training for Radial Basis Function Networks 98
A. Stepwise Hidden Node Selection 98
B. Benchmark Experiments 99
VI. Weight Initialization in Speech
Recognition Application 103
A. Speech Signals and Recognition 103
B. Principle of the Classifier 104
C. Training the Hybrid Classifier 106
D. Results 109
VII. Conclusion 116
Appendix I: Chessboard 4 X 4 116
Appendix II: Two Spirals 117
Appendix III: GaAs MESFET 117
Appendix IV: Credit Card 117
References 118
Fast Computation in Hamming and Hopfield Networks
Isaac Meilijson, Eytan Ruppin, and Moshe Sipper
I. General Introduction 123
II. Threshold Hamming Networks 124
A. Introduction 124
B. Threshold Hamming Network 126
C. Hamming Network and an Optimal Threshold
Hamming Network 128
D. Numerical Results 132
E. Final Remarks 134
III. Two-Iteration Optimal Signaling in
Hopfield Networks 135
A. Introduction 135
B. Model 137
C. Rationale for Nonmonotone Bayesian Signaling
D. Performance 142
E. Optimal Signaling and Performance 146
F. Results 148
G. Discussion 151
IV. Concluding Remarks 152
References 153
Multilevel Neurons
J. Si and A. N. Michel
I. Introduction 155
II. Neural System Analysis 157
A. Neuron Models 158
B. Neural Networks 160
C. Stability of an Equilibrium 162
D. Global Stability Results 164
III. Neural System Synthesis for Associative Memories 167
A. System Constraints 168
B. Synthesis Procedure 170
IV. Simulations 171
V. Conclusions and Discussions 173
Appendix 173
References 178
Probabilistic Design
Sumio Watanabe and Kenji Fukumizu
I. Introduction 181
II. Unified Framework of Neural Networks 182
A. Definition 182
B. Learning in Artificial Neural Networks 185
III. Probabilistic Design of Layered Neural Networks 189
A. Neural Network That Finds Unknown Inputs 189
B. Neural Network That Can Tell the Reliability of Its
Own Inference 192
C. Neural Network That Can Illustrate Input Patterns for a
Given Category 196
IV. Probability Competition Neural Networks 197
A. Probability Competition Neural Network Model and Its
Properties 198
B. Learning Algorithms for a Probability Competition
Neural Network 203
C. Applications of the Probability Competition
Neural Network Model 210
V. Statistical Techniques for Neural Network Design 218
A. Information Criterion for the Steepest Descent 218
B. Active Learning 225
VI. Conclusion 228
References 228
Short Time Memory Problems
M. Daniel Tom and Manoel Fernando Tenorio
I. Introduction 231
II. Background 232
III. Measuring Neural Responses 233
IV. Hysteresis Model 234
V. Perfect Memory 237
VI. Temporal Precedence Differentiation 239
VII. Study in Spatiotemporal Pattern Recognition 241
VIII. Conclusion 245
Appendix 246
References 260
Reliability Issue and Quantization Effects in Optical
and Electronic Network Implementations of
Hebbian-Type Associative Memories
Pau-Choo Chung and Ching-Tsorng Tsai
I. Introduction 261
II. Hebbian-Type Associative Memories 264
A. Linear-Order Associative Memories 264
B. Quadratic-Order Associative Memories 266
III. Network Analysis Using a Signal-to-Noise
Ratio Concept 266
IV. Reliability Effects in Network Implementations 268
A. Open-Circuit Effects 269
B. Short-Circuit Effects 274
V. Comparison of Linear and Quadratic Networks 278
VI. Quantization of Synaptic Interconnections 281
A. Three-Level Quantization 282
B. Three-Level Quantization with
Conserved Interconnections 286
VII. Conclusions 288
References 289
Finite Constraint Satisfaction
Angela Monfroglio
I. Constrained Heuristic Search and Neural Networks for Finite
Constraint Satisfaction Problems 293
A. Introduction 293
B. Shared Resource Allocation Algorithm 295
C. Satisfaction of a Conjunctive Normal Form 300
D. Connectionist Networks for Solving n-Conjunctive Normal
Form Satisfiability Problems 305
E. Other Connectionist Paradigms 311
F. Network Performance Summary 317
II. Linear Programming and Neural Networks 323
A. Conjunctive Normal Form Satisfaction and
Linear Programming 324
B. Connectionist Networks That Learn to Choose the
Position of Pivot Operations 329
III. Neural Networks and Genetic Algorithms 331
A. Neural Network 332
B. Genetic Algorithm for Optimizing the
Neural Network 336
C. Comparison with Conventional Linear Programming
Algorithms and Standard Constraint Propagation and
Search Techniques 337
D. Testing Data Base 340
IV. Related Work, Limitations, Further Work,
and Conclusions 341
Appendix I. Formal Description of the Shared Resource
Allocation Algorithm 342
Appendix II. Formal Description of the Conjunctive Normal
Form Satisfiability Algorithm 346
A. Discussion 348
Appendix III. A 3-CNF-SAT Example 348
Appendix IV. Outline of Proof for the Linear
Programming Algorithm 350
A. Preliminary Considerations 350
B. Interior Point Methods 357
C. Correctness and Completeness 358
References 359
Parallel, Self-Organizing, Hierarchical Neural
Network Systems
O. K. Ersoy
I. Introduction 364
II. Nonlinear Transformations of Input Vectors 366
A. Binary Input Data 366
B. Analog Input Data 366
C. Other Transformations 367
III. Training, Testing, and Error-Detection Bounds 367
A. Training 367
B. Testing 368
C. Detection of Potential Errors 368
IV. Interpretation of the Error-Detection Bounds 371
V. Comparison between the Parallel, Self-Organizing,
Hierarchical Neural Network, the Backpropagation Network,
and the Maximum Likelihood Method 373
A. Normally Distributed Data 374
B. Uniformly Distributed Data 379
VI. PNS Modules 379
VII. Parallel Consensual Neural Networks
A. Consensus Theory 382
B. Implementation 383
C. Optimal Weights 384
D. Experimental Results 385
VIII. Parallel, Self-Organizing, Hierarchical Neural Networks with
Competitive Learning and Safe Rejection Schemes 385
A. Safe Rejection Schemes 387
B. Training 389
C. Testing 390
D. Experimental Results 392
IX. Parallel, Self-Organizing, Hierarchical Neural Networks with
Continuous Inputs and Outputs 392
A. Learning of Input Nonlinearities by
Revised Backpropagation 393
B. Forward-Backward Training 394
X. Recent Applications 395
A. Fuzzy Input Signal Representation 395
B. Multiresolution Image Compression 397
XI. Conclusions 399
References 399
Dynamics of Networks of Biological Neurons:
Simulation and Experimental Tools
M. Bove, M. Giugliano, M. Grattarola, S. Martinoia, and G. Massobrio
I. Introduction 402
II. Modeling Tools 403
A. Conductance-Based Single-Compartment Differential
Model Neurons 403
B. Integrate-and-Fire Model Neurons 409
C. Synaptic Modeling 412
III. Arrays of Planar Microtransducers for Electrical Activity
Recording of Cultured Neuronal Populations 418
A. Neuronal Cell Cultures Growing on Substrate
Planar Microtransducers 419
B. Example of a Multisite Electrical Signal Recording from
Neuronal Cultures by Using Planar Microtransducer
Arrays and Its Simulations 420
IV. Concluding Remarks 421
References 422
Estimating the Dimensions of Manifolds Using
Delaunay Diagrams
Yun-Chung Chu
I. Delaunay Diagrams of Manifolds 425
II. Estimating the Dimensions of Manifolds 435
III. Conclusions 455
References 456
Index 457
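The opening chapter's treatment of radial basis function networks as linear models (Sections II.A–II.E) rests on one fact: once the centers are fixed, the output weights solve a ridge-regularized least-squares problem. A small sketch under that assumption — Gaussian basis functions, with the width and ridge penalty as arbitrary illustrative values, not figures from the chapter:

```python
import numpy as np

def rbf_fit(X, y, centers, width=1.0, ridge=1e-6):
    """Fit the linear output weights of a Gaussian RBF network by ridge regression."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    H = np.exp(-(d ** 2) / (2 * width ** 2))      # design matrix of basis responses
    # normal equations with a ridge term for stability (cf. Section II.E)
    return np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ y)

def rbf_predict(X, centers, w, width=1.0):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(d ** 2) / (2 * width ** 2)) @ w
```

Because the weights enter linearly, the bias/variance, cross-validation, and forward-selection analyses in that chapter carry over directly from ordinary linear regression; only the choice of centers and widths is nonlinear.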


Another Neural Network Books
Another The Core of CS Books Books
Download