comparison toolboxes/FullBNT-1.0.7/nethelp3.3/demmlp1.htm @ 0:e9a9cd732c1e tip
first hg version after svn
author | wolffd |
---|---|
date | Tue, 10 Feb 2015 15:05:51 +0000 |
parents | |
children | |
comparing -1:000000000000 with 0:e9a9cd732c1e
<html>
<head>
<title>
Netlab Reference Manual demmlp1
</title>
</head>
<body>
<H1> demmlp1
</H1>
<h2>
Purpose
</h2>
Demonstrate simple regression using a multi-layer perceptron

<p><h2>
Synopsis
</h2>
<PRE>
demmlp1</PRE>


<p><h2>
Description
</h2>
The problem consists of one input variable <CODE>x</CODE> and one target variable
<CODE>t</CODE>. The data are generated by sampling <CODE>x</CODE> at equal intervals and
computing the targets as <CODE>sin(2*pi*x)</CODE> plus additive Gaussian noise.
A 2-layer network with linear outputs is trained by minimizing a
sum-of-squares error function using the scaled conjugate gradient optimizer.

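<p><h2>
Example
</h2>
The following is a minimal sketch of the kind of script this demo runs, assuming the
standard Netlab helpers <CODE>netopt</CODE> and <CODE>mlpfwd</CODE> in addition to the functions
listed below. The data size, noise level, number of hidden units and number of training
cycles are illustrative choices, not values taken from this page.
<PRE>
% Generate noisy samples of sin(2*pi*x) on an equally spaced grid.
ndata = 20;                      % number of data points (assumed)
noise = 0.2;                     % noise standard deviation (assumed)
x = linspace(0, 1, ndata)';
t = sin(2*pi*x) + noise*randn(ndata, 1);

% Create a 2-layer MLP with one input, a small hidden layer
% and a single linear output.
net = mlp(1, 3, 1, 'linear');    % 3 hidden units (assumed)

% Train with scaled conjugate gradients, minimizing the
% sum-of-squares error.
options = zeros(1, 18);
options(1) = 1;                  % display error values during training
options(14) = 100;               % number of training cycles (assumed)
[net, options] = netopt(net, options, x, t, 'scg');

% Evaluate the trained network on a fine grid.
xtest = linspace(0, 1, 200)';
ytest = mlpfwd(net, xtest);
</PRE>
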
<p><h2>
See Also
</h2>
<CODE><a href="mlp.htm">mlp</a></CODE>, <CODE><a href="mlperr.htm">mlperr</a></CODE>, <CODE><a href="mlpgrad.htm">mlpgrad</a></CODE>, <CODE><a href="scg.htm">scg</a></CODE><hr>
<b>Pages:</b>
<a href="index.htm">Index</a>
<hr>
<p>Copyright (c) Ian T Nabney (1996-9)


</body>
</html>