camir-aes2014: comparison of toolboxes/FullBNT-1.0.7/nethelp3.3/somtrain.htm @ 0:e9a9cd732c1e (tip)

author:  wolffd
date:    Tue, 10 Feb 2015 15:05:51 +0000
summary: first hg version after svn
<html>
<head>
<title>
Netlab Reference Manual somtrain
</title>
</head>
<body>
<H1> somtrain
</H1>
<h2>
Purpose
</h2>
Kohonen training algorithm for SOM.

<p><h2>
Synopsis
</h2>
<PRE>

net = somtrain(net, options, x)
</PRE>


<p><h2>
Description
</h2>
<CODE>net = somtrain(net, options, x)</CODE> uses Kohonen's algorithm to
train a SOM. Both on-line and batch algorithms are implemented.
The learning rate (for the on-line algorithm) and the neighbourhood size
decay linearly. There is no error function minimised during training (so
there is no termination criterion other than the number of epochs), but
the sum-of-squares is computed and returned in <CODE>options(8)</CODE>.

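<p>As a minimal sketch of a default on-line run (the data matrix, input
dimension and map size below are illustrative assumptions, not taken from
this page):
<PRE>

x = rand(200, 3);        % assumed toy data: 200 patterns, 3 inputs
net = som(3, [5, 5]);    % 5x5 map over the 3-dimensional inputs
options = foptions;      % default options vector
options(1) = 1;          % log error, learning rate and neighbourhood size
options(14) = 20;        % 20 passes through the pattern set
net = somtrain(net, options, x);
</PRE>
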
<p>The optional parameters have the following interpretations.

<p>If <CODE>options(1)</CODE> is set to 1, error values are displayed during
training, together with the learning rate <CODE>alpha</CODE> and the
neighbourhood size <CODE>nsize</CODE>. Otherwise nothing is displayed.

<p><CODE>options(5)</CODE> determines whether the patterns are sampled randomly
with replacement. If it is 0 (the default), then patterns are sampled
in order. This is only relevant to the on-line algorithm.

<p><CODE>options(6)</CODE> determines whether the on-line or batch algorithm is
used. If it is 1, the batch algorithm is used; if it is 0 (the default),
the on-line algorithm is used.

<p><CODE>options(14)</CODE> is the maximum number of iterations (passes through
the complete pattern set); default 100.

<p><CODE>options(15)</CODE> is the final neighbourhood size; default value is the
same as the initial neighbourhood size.

<p><CODE>options(16)</CODE> is the final learning rate; default value is the same
as the initial learning rate.

<p><CODE>options(17)</CODE> is the initial neighbourhood size; default 0.5*maximum
map size.

<p><CODE>options(18)</CODE> is the initial learning rate; default 0.9. This parameter
must be positive.

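<p>For instance, to use the batch algorithm rather than the default on-line
updates, only <CODE>options(6)</CODE> needs to be set; this sketch reuses the
illustrative <CODE>net</CODE> and <CODE>x</CODE> from the sketch above, and the
neighbourhood schedule shown is an assumed choice, not a recommendation:
<PRE>

options = foptions;
options(6) = 1;          % batch algorithm (options(5) is only used on-line)
options(14) = 100;       % maximum number of passes (the default)
options(17) = 3;         % initial neighbourhood size (assumed value)
options(15) = 1;         % final neighbourhood size
net = somtrain(net, options, x);
</PRE>
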
<p><h2>
Examples
</h2>
The following example performs on-line training on a SOM in two stages:
ordering and convergence.
<PRE>

net = som(nin, [8, 7]);
options = foptions;

% Ordering phase
options(1) = 1;
options(14) = 50;
options(18) = 0.9;  % Initial learning rate
options(16) = 0.05; % Final learning rate
options(17) = 8;    % Initial neighbourhood size
options(15) = 1;    % Final neighbourhood size
net2 = somtrain(net, options, x);

% Convergence phase
options(14) = 400;
options(18) = 0.05;
options(16) = 0.01;
options(17) = 0;
options(15) = 0;
net3 = somtrain(net2, options, x);
</PRE>
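
<p>After training, each pattern can be mapped to its winning node with
<CODE>somfwd</CODE>; a brief sketch, assuming the two-output form described
on the <CODE>somfwd</CODE> page:
<PRE>

[d2, win_nodes] = somfwd(net3, x);  % squared distances and winning node indices
</PRE>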


<p><h2>
See Also
</h2>
<CODE><a href="kmeans.htm">kmeans</a></CODE>, <CODE><a href="som.htm">som</a></CODE>, <CODE><a href="somfwd.htm">somfwd</a></CODE><hr>
<b>Pages:</b>
<a href="index.htm">Index</a>
<hr>
<p>Copyright (c) Ian T Nabney (1996-9)


</body>
</html>