Neural Network Toolbox 6 User's Guide


Neural Network Toolbox 6
User's Guide

Howard Demuth
Mark Beale
Martin Hagan

How to Contact The MathWorks

Web                                          www.mathworks.com
Newsgroup                                    comp.soft-sys.matlab
Technical Support                            www.mathworks.com/contact_TS.html
Product enhancement suggestions              suggest@mathworks.com
Bug reports                                  bugs@mathworks.com
Documentation error reports                  doc@mathworks.com
Order status, license renewals, passcodes    service@mathworks.com
Sales, pricing, and general information      info@mathworks.com

508-647-7000 (Phone)
508-647-7001 (Fax)

The MathWorks, Inc.
3 Apple Hill Drive
Natick, MA 01760-2098

For contact information about worldwide offices, see the MathWorks Web site.

Neural Network Toolbox User's Guide
© COPYRIGHT 1992–2010 by The MathWorks, Inc.

The software described in this document is furnished under a license agreement. The software may be used or copied only under the terms of the license agreement. No part of this manual may be photocopied or reproduced in any form without prior written consent from The MathWorks, Inc.

FEDERAL ACQUISITION: This provision applies to all acquisitions of the Program and Documentation by, for, or through the federal government of the United States. By accepting delivery of the Program or Documentation, the government hereby agrees that this software or documentation qualifies as commercial computer software or commercial computer software documentation as such terms are used or defined in FAR 12.212, DFARS Part 227.72, and DFARS 252.227-7014. Accordingly, the terms and conditions of this Agreement, and only those rights specified in this Agreement, shall pertain to and govern the use, modification, reproduction, release, performance, display, and disclosure of the Program and Documentation by the federal government (or other entity acquiring for or through the federal government) and shall supersede any conflicting contractual terms or conditions. If this License fails to meet the government's needs or is inconsistent in any respect with federal procurement law, the government agrees to return the Program and Documentation, unused, to The MathWorks, Inc.

Trademarks

MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See www.mathworks.com/trademarks for a list of additional trademarks. Other product or brand names may be trademarks or registered trademarks of their respective holders.

Patents

The MathWorks products are protected by one or more U.S. patents. Please see www.mathworks.com/patents for more information.

Revision History

June 1992         First printing
April 1993        Second printing
January 1997      Third printing
July 1997         Fourth printing
January 1998      Fifth printing      Revised for Version 3 (Release 11)
September 2000    Sixth printing      Revised for Version 4 (Release 12)
June 2001         Seventh printing    Minor revisions (Release 12.1)
July 2002         Online only         Minor revisions (Release 13)
January 2003      Online only         Minor revisions (Release 13SP1)
June 2004         Online only         Revised for Version 4.0.3 (Release 14)
October 2004      Online only         Revised for Version 4.0.4 (Release 14SP1)
October 2004      Eighth printing     Revised for Version 4.0.4
March 2005        Online only         Revised for Version 4.0.5 (Release 14SP2)
March 2006        Online only         Revised for Version 5.0 (Release 2006a)
September 2006    Ninth printing      Minor revisions (Release 2006b)
March 2007        Online only         Minor revisions (Release 2007a)
September 2007    Online only         Revised for Version 5.1 (Release 2007b)
March 2008        Online only         Revised for Version 6.0 (Release 2008a)
October 2008      Online only         Revised for Version 6.0.1 (Release 2008b)
March 2009        Online only         Revised for Version 6.0.2 (Release 2009a)
September 2009    Online only         Revised for Version 6.0.3 (Release 2009b)
March 2010        Online only         Revised for Version 6.0.4 (Release 2010a)

Acknowledgments

The authors would like to thank the following people:

Joe Hicklin of The MathWorks for getting Howard into neural network research years ago at the University of Idaho, for encouraging Howard and Mark to write the toolbox, for providing crucial help in getting the first toolbox Version 1.0 out the door, for continuing to help with the toolbox in many ways, and for being such a good friend.

Roy Lurie of The MathWorks for his continued enthusiasm for the possibilities for Neural Network Toolbox software.

Mary Ann Freeman for general support and for her leadership of a great team of people we enjoy working with.

Rakesh Kumar for cheerfully providing technical and practical help, encouragement, ideas, and always going the extra mile for us.

Alan LaFleur for facilitating our documentation work.

Tara Scott and Stephen Vanreusel for help with testing.

Orlando De Jesús of Oklahoma State University for his excellent work in developing and programming the dynamic training algorithms described in Chapter 6, "Dynamic Networks," and in programming the neural network controllers described in Chapter 7, "Control Systems."

Martin Hagan, Howard Demuth, and Mark Beale for permission to include various problems, demonstrations, and other material from Neural Network Design, January, 1996.

Neural Network Toolbox Design Book

The developers of the Neural Network Toolbox software have written a textbook, Neural Network Design (Hagan, Demuth, and Beale, ISBN 0-9717321-0-8). The book presents the theory of neural networks, discusses their design and application, and makes considerable use of the MATLAB environment and Neural Network Toolbox software. Demonstration programs from the book are used in various chapters of this user's guide. (You can find all the book demonstration programs in the Neural Network Toolbox software by typing nnd.)

This book can be obtained from John Stovall at (303) 492-3648, or by e-mail at John.Stovall@colorado.edu.

The Neural Network Design textbook includes:

- An Instructor's Manual for those who adopt the book for a class
- Transparency Masters for class use

If you are teaching a class and want an Instructor's Manual (with solutions to the book exercises), contact John Stovall at (303) 492-3648, or by e-mail at John.Stovall@colorado.edu.

To look at sample chapters of the book and to obtain Transparency Masters, go directly to the Neural Network Design page at

http://hagan.okstate.edu/nnd.html

From this link, you can obtain sample book chapters in PDF format and you can download the Transparency Masters by clicking Transparency Masters (3.6MB). You can get the Transparency Masters in PowerPoint or PDF format.
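As a minimal illustration of the nnd command mentioned above (assuming Neural Network Toolbox is installed and on your MATLAB path), enter the following at the MATLAB command prompt:

    % Open the Neural Network Design demonstration interface
    % that ships with Neural Network Toolbox.
    nnd

From the window that opens, you can browse and run the individual book demonstrations referenced throughout this guide.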


Contents

1  Getting Started
   Product Overview
   Using the Documentation
   Applications for Neural Network Toolbox Software
      Applications in This Toolbox
      Business Applications
   Fitting a Function
      Defining a Problem
      Using Command-Line Functions
      Using the Neural Network Fitting Tool GUI
   Recognizing Patterns
      Defining a Problem
      Using Command-Line Functions
      Using the Neural Network Pattern Recognition Tool GUI
   Clustering Data
      Defining a Problem
      Using Command-Line Functions
      Using the Neural Network Clustering Tool GUI

2  Neuron Model and Network Architectures
   Neuron Model
      Simple Neuron
      Transfer Functions
      Neuron with Vector Input
   Network Architectures
      A Layer of Neurons
      Multiple Layers of Neurons
      Input and Output Processing Functions
   Data Structures
      Simulation with Concurrent Inputs in a Static Network
      Simulation with Sequential Inputs in a Dynamic Network
      Simulation with Concurrent Inputs in a Dynamic Network
   Training Styles
      Incremental Training (of Adaptive and Other Networks)
      Batch Training
      Training Feedback

3  Perceptrons
   Introduction
      Important Perceptron Functions
   Neuron Model
   Perceptron Architecture
   Creating a Perceptron (newp)
      Simulation (sim)
      Initialization (init)
   Learning Rules
   Perceptron Learning Rule (learnp)
   Training (train)
   Limitations and Cautions
      Outliers and the Normalized Perceptron Rule
   Graphical User Interface
      Introduction to the GUI
      Create a Perceptron Network (nntool)
      Train the Perceptron
      Export Perceptron Results to the Workspace
      Clear the Network/Data Window
      Importing from the Command Line
      Save a Variable to a File and Load It Later

4  Linear Filters
   Introduction
   Neuron Model
   Network Architecture
      Creating a Linear Neuron (newlin)
   Least Mean Square Error
   Linear System Design (newlind)
   Linear Networks with Delays
      Tapped Delay Line
      Linear Filter
   LMS Algorithm (learnwh)
   Linear Classification (train)
   Limitations and Cautions
      Overdetermined Systems
      Underdetermined Systems
      Linearly Dependent Vectors
      Too Large a Learning Rate

5  Backpropagation
   Introduction
   Solving a Problem
      Improving Results
      Under the Hood
   Architecture
      Feedforward Network
   Simulation (sim)
   Training
      Backpropagation Algorithm
   Faster Training
      Variable Learning Rate (traingda, traingdx)
      Resilient Backpropagation (trainrp)
      Conjugate Gradient Algorithms
      Line Search Routines
      Quasi-Newton Algorithms
      Levenberg-Marquardt (trainlm)
      Reduced Memory Levenberg-Marquardt (trainlm)
   Speed and Memory Comparison
      Summary
   Improving Generalization
      Early Stopping
      Index Data Division (divideind)
      Random Data Division (dividerand)
      Block Data Division (divideblock)
      Interleaved Data Division (divideint)
      Regularization
      Summary and Discussion of Early Stopping and Regularization
   Preprocessing and Postprocessing
      Min and Max (mapminmax)
      Mean and Stand. Dev. (mapstd)
      Principal Component Analysis (processpca)
      Processing Unknown Inputs (fixunknowns)
      Representing Unknown or Don't Care Targets
      Posttraining Analysis (postreg)
   Sample Training Session
   Limitations and Cautions

6  Dynamic Networks
   Introduction
      Examples of Dynamic Networks
      Applications of Dynamic Networks
      Dynamic Network Structures
      Dynamic Network Training
   Focused Time-Delay Neural Network (newfftd)
   Distributed Time-Delay Neural Network (newdtdnn)
   NARX Network (newnarx, newnarxsp, sp2narx)
   Layer-Recurrent Network (newlrn)

7  Control Systems
   Introduction
   NN Predictive Control
      System Identification
      Predictive Control
      Using the NN Predictive Controller Block
   NARMA-L2 (Feedback Linearization) Control
      Identification of the NARMA-L2 Model
      NARMA-L2 Controller
      Using the NARMA-L2 Controller Block
   Model Reference Control
      Using the Model Reference Controller Block
   Importing and Exporting
      Importing and Exporting Networks
      Importing and Exporting Training Data

8  Radial Basis Networks
   Introduction
      Important Radial Basis Functions
   Radial Basis Functions
      Neuron Model
      Network Architecture
      Exact Design (newrbe)
      More Efficient Design (newrb)
      Demonstrations
   Probabilistic Neural Networks
      Network Architecture
      Design (newpnn)
   Generalized Regression Networks
      Network Architecture
      Design (newgrnn)

9  Self-Organizing and Learning Vector Quantization Nets
   Introduction
      Important Self-Organizing and LVQ Functions
   Competitive Learning
      Architecture
      Creating a Competitive Neural Network (newc)
      Kohonen Learning Rule (learnk)
      Bias Learning Rule (learncon)
      Training
      Graphical Example
   Self-Organizing Feature Maps
      Topologies (gridtop, hextop, randtop)
      Distance Functions (dist, linkdist, mandist, boxdist)
      Architecture
      Creating a Self-Organizing Map Neural Network (newsom)
      Training (learnsomb)
      Examples
   Learning Vector Quantization Networks
      Architecture
      Creating an LVQ Network (newlvq)
      LVQ1 Learning Rule (learnlv1)
      Training
      Supplemental LVQ2.1 Learning Rule (learnlv2)

10  Adaptive Filters and Adaptive Training
   Introduction
      Important Adaptive Functions
   Linear Neuron Model
   Adaptive Linear Network Architecture
      Single ADALINE (newlin)
   Least Mean Square Error
   LMS Algorithm (learnwh)
   Adaptive Filtering (adapt)
      Tapped Delay Line
      Adaptive Filter
      Adaptive Filter Example
      Prediction Example
      Noise Cancellation Example
      Multiple Neuron Adaptive Filters

11  Applications
   Introduction
      Application Scripts
   Applin1: Linear Design
      Problem Definition
      Network Design
      Network Testing
      Thoughts and Conclusions
   Applin2: Adaptive Prediction
      Problem Definition
      Network Initialization
      Network Training
      Network Testing
      Thoughts and Conclusions
   Appelm1: Amplitude Detection
      Problem Definition
      Network Initialization
      Network Training
      Network Testing
      Network Generalization
      Improving Performance
   Appcr1: Character Recognition
      Problem Statement
      Neural Network
      System Performance

12  Advanced Topics
   Custom Networks
      Custom Network
      Network Definition
      Network Behavior
   Additional Toolbox Functions
   Custom Functions

13  Historical Networks
   Introduction
      Important Recurrent Network Functions
   Elman Networks
      Architecture
      Creating an Elman Network (newelm)
      Training an Elman Network
   Hopfield Network
      Fundamentals
      Architecture
      Design (newhop)

14  Network Object Reference
   Network Properties
      Architecture
      Subobject Structures
      Functions
      Parameters
      Weight and Bias Values
      Other
   Subobject Properties
      Inputs
      Layers
      Outputs
      Biases
      Input Weights
      Layer Weights

15  Function Reference
   Analysis Functions
   Distance Functions
   Graphical Interface Functions
   Layer Initialization Functions
   Learning Functions
   Line Search Functions
   Net Input Functions
   Network Initialization Function
   Network Use Functions
   New Networks Functions
   Performance Functions
   Plotting Functions
   Processing Functions
   Simulink Support Function
   Topology Functions
   Training Functions
   Transfer Functions
   Utility Functions
   Vector Functions
   Weight and Bias Initialization Functions
   Weight Functions
   Transfer Function Graphs

16  Functions — Alphabetical List

A  Mathematical Notation
   Mathematical Notation for Equations and Figures
      Basic Concepts
