<html>
<head>
<title>
Netlab Reference Manual demmlp1
</title>
</head>
<body>
<H1> demmlp1
</H1>
<h2>
Purpose
</h2>
Demonstrate simple regression using a multi-layer perceptron

<p><h2>
Synopsis
</h2>
<PRE>
demmlp1</PRE>


<p><h2>
Description
</h2>
The problem consists of one input variable <CODE>x</CODE> and one target variable
<CODE>t</CODE> with data generated by sampling <CODE>x</CODE> at equal intervals and then
generating target data by computing <CODE>sin(2*pi*x)</CODE> and adding Gaussian
noise. A 2-layer network with linear outputs is trained by minimizing a
sum-of-squares error function using the scaled conjugate gradient optimizer.

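<p>The following is a minimal Netlab sketch of the procedure described above.
The number of data points, the noise level, the number of hidden units and the
number of training cycles are illustrative assumptions, and the sketch uses the
general-purpose <CODE>netopt</CODE> and <CODE>mlpfwd</CODE> wrappers, which may
differ from the exact calls made by <CODE>demmlp1</CODE>.
<PRE>
% Generate noisy sine data: x sampled at equal intervals (assumed range [0,1]).
x = linspace(0, 1, 20)';
t = sin(2*pi*x) + 0.2*randn(size(x));   % targets = sin(2*pi*x) + Gaussian noise

% 2-layer MLP: 1 input, 3 hidden units (assumed), 1 linear output.
net = mlp(1, 3, 1, 'linear');

% Train by minimizing the sum-of-squares error with scaled conjugate gradients.
options = zeros(1, 18);
options(1) = 1;       % display error values during training
options(14) = 100;    % maximum number of training cycles (assumed)
[net, options] = netopt(net, options, x, t, 'scg');

% Evaluate the trained network on a fine grid and plot the fit.
xtest = linspace(0, 1, 200)';
ytest = mlpfwd(net, xtest);
plot(x, t, 'o', xtest, ytest, '-');
</PRE>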
<p><h2>
See Also
</h2>
<CODE><a href="mlp.htm">mlp</a></CODE>, <CODE><a href="mlperr.htm">mlperr</a></CODE>, <CODE><a href="mlpgrad.htm">mlpgrad</a></CODE>, <CODE><a href="scg.htm">scg</a></CODE><hr>
<b>Pages:</b>
<a href="index.htm">Index</a>
<hr>
<p>Copyright (c) Ian T Nabney (1996-9)


</body>
</html>