<!-- toolboxes/FullBNT-1.0.7/nethelp3.3/index.htm, revision 0:e9a9cd732c1e (tip): first hg version after svn; author wolffd, Tue, 10 Feb 2015 15:05:51 +0000 -->
<html>
<head>
<title>
NETLAB Reference Documentation
</title>
</head>
<body>
<H1> NETLAB Online Reference Documentation </H1>
Welcome to the NETLAB online reference documentation.
The NETLAB simulation software is designed to provide all the tools necessary
for principled and theoretically well-founded application development. The
NETLAB library is based on the approach and techniques described in <I>Neural
Networks for Pattern Recognition</I> (Bishop, 1995). The library includes software
implementations of a wide range of data analysis techniques, many of which are
not widely available and are rarely, if ever, included in standard neural
network simulation packages.
<p>The online reference documentation provides direct hypertext links to specific Netlab function descriptions.
<p>If you have any comments or problems to report, please contact Ian Nabney (<a href="mailto:i.t.nabney@aston.ac.uk"><tt>i.t.nabney@aston.ac.uk</tt></a>) or Christopher Bishop (<a href="mailto:c.m.bishop@aston.ac.uk"><tt>c.m.bishop@aston.ac.uk</tt></a>).
<H1> Index </H1>
An alphabetic list of the functions in Netlab.<p>
<DL>
<DT>
<CODE><a href="conffig.htm">conffig</a></CODE><DD>
Display a confusion matrix.
<DT>
<CODE><a href="confmat.htm">confmat</a></CODE><DD>
Compute a confusion matrix.
<DT>
<CODE><a href="conjgrad.htm">conjgrad</a></CODE><DD>
Conjugate gradients optimization.
<DT>
<CODE><a href="consist.htm">consist</a></CODE><DD>
Check that arguments are consistent.
<DT>
<CODE><a href="convertoldnet.htm">convertoldnet</a></CODE><DD>
Convert pre-2.3 release MLP and MDN nets to the new format.
<DT>
<CODE><a href="datread.htm">datread</a></CODE><DD>
Read data from an ASCII file.
<DT>
<CODE><a href="datwrite.htm">datwrite</a></CODE><DD>
Write data to an ASCII file.
<DT>
<CODE><a href="dem2ddat.htm">dem2ddat</a></CODE><DD>
Generates two-dimensional data for demos.
<DT>
<CODE><a href="demard.htm">demard</a></CODE><DD>
Automatic relevance determination using the MLP.
<DT>
<CODE><a href="demev1.htm">demev1</a></CODE><DD>
Demonstrate Bayesian regression for the MLP.
<DT>
<CODE><a href="demev2.htm">demev2</a></CODE><DD>
Demonstrate Bayesian classification for the MLP.
<DT>
<CODE><a href="demev3.htm">demev3</a></CODE><DD>
Demonstrate Bayesian regression for the RBF.
<DT>
<CODE><a href="demgauss.htm">demgauss</a></CODE><DD>
Demonstrate sampling from Gaussian distributions.
<DT>
<CODE><a href="demglm1.htm">demglm1</a></CODE><DD>
Demonstrate simple classification using a generalized linear model.
<DT>
<CODE><a href="demglm2.htm">demglm2</a></CODE><DD>
Demonstrate simple classification using a generalized linear model.
<DT>
<CODE><a href="demgmm1.htm">demgmm1</a></CODE><DD>
Demonstrate density modelling with a Gaussian mixture model.
<DT>
<CODE><a href="demgmm3.htm">demgmm3</a></CODE><DD>
Demonstrate density modelling with a Gaussian mixture model.
<DT>
<CODE><a href="demgmm4.htm">demgmm4</a></CODE><DD>
Demonstrate density modelling with a Gaussian mixture model.
<DT>
<CODE><a href="demgmm5.htm">demgmm5</a></CODE><DD>
Demonstrate density modelling with a PPCA mixture model.
<DT>
<CODE><a href="demgp.htm">demgp</a></CODE><DD>
Demonstrate simple regression using a Gaussian Process.
<DT>
<CODE><a href="demgpard.htm">demgpard</a></CODE><DD>
Demonstrate ARD using a Gaussian Process.
<DT>
<CODE><a href="demgpot.htm">demgpot</a></CODE><DD>
Computes the gradient of the negative log likelihood for a mixture model.
<DT>
<CODE><a href="demgtm1.htm">demgtm1</a></CODE><DD>
Demonstrate EM for GTM.
<DT>
<CODE><a href="demgtm2.htm">demgtm2</a></CODE><DD>
Demonstrate GTM for visualisation.
<DT>
<CODE><a href="demhint.htm">demhint</a></CODE><DD>
Demonstration of Hinton diagram for 2-layer feed-forward network.
<DT>
<CODE><a href="demhmc1.htm">demhmc1</a></CODE><DD>
Demonstrate Hybrid Monte Carlo sampling on a mixture of two Gaussians.
<DT>
<CODE><a href="demhmc2.htm">demhmc2</a></CODE><DD>
Demonstrate Bayesian regression with Hybrid Monte Carlo sampling.
<DT>
<CODE><a href="demhmc3.htm">demhmc3</a></CODE><DD>
Demonstrate Bayesian regression with Hybrid Monte Carlo sampling.
<DT>
<CODE><a href="demkmean.htm">demkmean</a></CODE><DD>
Demonstrate simple clustering model trained with K-means.
<DT>
<CODE><a href="demknn1.htm">demknn1</a></CODE><DD>
Demonstrate nearest neighbour classifier.
<DT>
<CODE><a href="demmdn1.htm">demmdn1</a></CODE><DD>
Demonstrate fitting a multi-valued function using a Mixture Density Network.
<DT>
<CODE><a href="demmet1.htm">demmet1</a></CODE><DD>
Demonstrate Markov Chain Monte Carlo sampling on a Gaussian.
<DT>
<CODE><a href="demmlp1.htm">demmlp1</a></CODE><DD>
Demonstrate simple regression using a multi-layer perceptron.
<DT>
<CODE><a href="demmlp2.htm">demmlp2</a></CODE><DD>
Demonstrate simple classification using a multi-layer perceptron.
<DT>
<CODE><a href="demnlab.htm">demnlab</a></CODE><DD>
A front-end Graphical User Interface to the demos.
<DT>
<CODE><a href="demns1.htm">demns1</a></CODE><DD>
Demonstrate Neuroscale for visualisation.
<DT>
<CODE><a href="demolgd1.htm">demolgd1</a></CODE><DD>
Demonstrate simple MLP optimisation with on-line gradient descent.
<DT>
<CODE><a href="demopt1.htm">demopt1</a></CODE><DD>
Demonstrate different optimisers on Rosenbrock's function.
<DT>
<CODE><a href="dempot.htm">dempot</a></CODE><DD>
Computes the negative log likelihood for a mixture model.
<DT>
<CODE><a href="demprgp.htm">demprgp</a></CODE><DD>
Demonstrate sampling from a Gaussian Process prior.
<DT>
<CODE><a href="demprior.htm">demprior</a></CODE><DD>
Demonstrate sampling from a multi-parameter Gaussian prior.
<DT>
<CODE><a href="demrbf1.htm">demrbf1</a></CODE><DD>
Demonstrate simple regression using a radial basis function network.
<DT>
<CODE><a href="demsom1.htm">demsom1</a></CODE><DD>
Demonstrate SOM for visualisation.
<DT>
<CODE><a href="demtrain.htm">demtrain</a></CODE><DD>
Demonstrate training of MLP network.
<DT>
<CODE><a href="dist2.htm">dist2</a></CODE><DD>
Calculates squared distance between two sets of points.
<DT>
<CODE><a href="eigdec.htm">eigdec</a></CODE><DD>
Sorted eigendecomposition.
<DT>
<CODE><a href="errbayes.htm">errbayes</a></CODE><DD>
Evaluate Bayesian error function for network.
<DT>
<CODE><a href="evidence.htm">evidence</a></CODE><DD>
Re-estimate hyperparameters using evidence approximation.
<DT>
<CODE><a href="fevbayes.htm">fevbayes</a></CODE><DD>
Evaluate Bayesian regularisation for network forward propagation.
<DT>
<CODE><a href="gauss.htm">gauss</a></CODE><DD>
Evaluate a Gaussian distribution.
<DT>
<CODE><a href="gbayes.htm">gbayes</a></CODE><DD>
Evaluate gradient of Bayesian error function for network.
<DT>
<CODE><a href="glm.htm">glm</a></CODE><DD>
Create a generalized linear model.
<DT>
<CODE><a href="glmderiv.htm">glmderiv</a></CODE><DD>
Evaluate derivatives of GLM outputs with respect to weights.
<DT>
<CODE><a href="glmerr.htm">glmerr</a></CODE><DD>
Evaluate error function for generalized linear model.
<DT>
<CODE><a href="glmevfwd.htm">glmevfwd</a></CODE><DD>
Forward propagation with evidence for GLM.
<DT>
<CODE><a href="glmfwd.htm">glmfwd</a></CODE><DD>
Forward propagation through generalized linear model.
<DT>
<CODE><a href="glmgrad.htm">glmgrad</a></CODE><DD>
Evaluate gradient of error function for generalized linear model.
<DT>
<CODE><a href="glmhess.htm">glmhess</a></CODE><DD>
Evaluate the Hessian matrix for a generalised linear model.
<DT>
<CODE><a href="glminit.htm">glminit</a></CODE><DD>
Initialise the weights in a generalized linear model.
<DT>
<CODE><a href="glmpak.htm">glmpak</a></CODE><DD>
Combines weights and biases into one weights vector.
<DT>
<CODE><a href="glmtrain.htm">glmtrain</a></CODE><DD>
Specialised training of generalized linear model.
<DT>
<CODE><a href="glmunpak.htm">glmunpak</a></CODE><DD>
Separates weights vector into weight and bias matrices.
<DT>
<CODE><a href="gmm.htm">gmm</a></CODE><DD>
Creates a Gaussian mixture model with specified architecture.
<DT>
<CODE><a href="gmmactiv.htm">gmmactiv</a></CODE><DD>
Computes the activations of a Gaussian mixture model.
<DT>
<CODE><a href="gmmem.htm">gmmem</a></CODE><DD>
EM algorithm for Gaussian mixture model.
<DT>
<CODE><a href="gmminit.htm">gmminit</a></CODE><DD>
Initialises Gaussian mixture model from data.
<DT>
<CODE><a href="gmmpak.htm">gmmpak</a></CODE><DD>
Combines all the parameters in a Gaussian mixture model into one vector.
<DT>
<CODE><a href="gmmpost.htm">gmmpost</a></CODE><DD>
Computes the class posterior probabilities of a Gaussian mixture model.
<DT>
<CODE><a href="gmmprob.htm">gmmprob</a></CODE><DD>
Computes the data probability for a Gaussian mixture model.
<DT>
<CODE><a href="gmmsamp.htm">gmmsamp</a></CODE><DD>
Sample from a Gaussian mixture distribution.
<DT>
<CODE><a href="gmmunpak.htm">gmmunpak</a></CODE><DD>
Separates a vector of Gaussian mixture model parameters into its components.
<DT>
<CODE><a href="gp.htm">gp</a></CODE><DD>
Create a Gaussian Process.
<DT>
<CODE><a href="gpcovar.htm">gpcovar</a></CODE><DD>
Calculate the covariance for a Gaussian Process.
<DT>
<CODE><a href="gpcovarf.htm">gpcovarf</a></CODE><DD>
Calculate the covariance function for a Gaussian Process.
<DT>
<CODE><a href="gpcovarp.htm">gpcovarp</a></CODE><DD>
Calculate the prior covariance for a Gaussian Process.
<DT>
<CODE><a href="gperr.htm">gperr</a></CODE><DD>
Evaluate error function for Gaussian Process.
<DT>
<CODE><a href="gpfwd.htm">gpfwd</a></CODE><DD>
Forward propagation through Gaussian Process.
<DT>
<CODE><a href="gpgrad.htm">gpgrad</a></CODE><DD>
Evaluate error gradient for Gaussian Process.
<DT>
<CODE><a href="gpinit.htm">gpinit</a></CODE><DD>
Initialise Gaussian Process model.
<DT>
<CODE><a href="gppak.htm">gppak</a></CODE><DD>
Combines GP hyperparameters into one vector.
<DT>
<CODE><a href="gpunpak.htm">gpunpak</a></CODE><DD>
Separates hyperparameter vector into components.
<DT>
<CODE><a href="gradchek.htm">gradchek</a></CODE><DD>
Checks a user-defined gradient function using finite differences.
<DT>
<CODE><a href="graddesc.htm">graddesc</a></CODE><DD>
Gradient descent optimization.
<DT>
<CODE><a href="gsamp.htm">gsamp</a></CODE><DD>
Sample from a Gaussian distribution.
<DT>
<CODE><a href="gtm.htm">gtm</a></CODE><DD>
Create a Generative Topographic Map.
<DT>
<CODE><a href="gtmem.htm">gtmem</a></CODE><DD>
EM algorithm for Generative Topographic Mapping.
<DT>
<CODE><a href="gtmfwd.htm">gtmfwd</a></CODE><DD>
Forward propagation through GTM.
<DT>
<CODE><a href="gtminit.htm">gtminit</a></CODE><DD>
Initialise the weights and latent sample in a GTM.
<DT>
<CODE><a href="gtmlmean.htm">gtmlmean</a></CODE><DD>
Mean responsibility for data in a GTM.
<DT>
<CODE><a href="gtmlmode.htm">gtmlmode</a></CODE><DD>
Mode responsibility for data in a GTM.
<DT>
<CODE><a href="gtmmag.htm">gtmmag</a></CODE><DD>
Magnification factors for a GTM.
<DT>
<CODE><a href="gtmpost.htm">gtmpost</a></CODE><DD>
Latent space responsibility for data in a GTM.
<DT>
<CODE><a href="gtmprob.htm">gtmprob</a></CODE><DD>
Probability for data under a GTM.
<DT>
<CODE><a href="hbayes.htm">hbayes</a></CODE><DD>
Evaluate Hessian of Bayesian error function for network.
<DT>
<CODE><a href="hesschek.htm">hesschek</a></CODE><DD>
Use central differences to confirm correct evaluation of Hessian matrix.
<DT>
<CODE><a href="hintmat.htm">hintmat</a></CODE><DD>
Evaluates the coordinates of the patches for a Hinton diagram.
<DT>
<CODE><a href="hinton.htm">hinton</a></CODE><DD>
Plot Hinton diagram for a weight matrix.
<DT>
<CODE><a href="histp.htm">histp</a></CODE><DD>
Histogram estimate of 1-dimensional probability distribution.
<DT>
<CODE><a href="hmc.htm">hmc</a></CODE><DD>
Hybrid Monte Carlo sampling.
<DT>
<CODE><a href="kmeans.htm">kmeans</a></CODE><DD>
Trains a k-means cluster model.
<DT>
<CODE><a href="knn.htm">knn</a></CODE><DD>
Creates a K-nearest-neighbour classifier.
<DT>
<CODE><a href="knnfwd.htm">knnfwd</a></CODE><DD>
Forward propagation through a K-nearest-neighbour classifier.
<DT>
<CODE><a href="linef.htm">linef</a></CODE><DD>
Calculate function value along a line.
<DT>
<CODE><a href="linemin.htm">linemin</a></CODE><DD>
One-dimensional minimization.
<DT>
<CODE><a href="maxitmess.htm">maxitmess</a></CODE><DD>
Create a standard error message when training reaches the maximum number of iterations.
<DT>
<CODE><a href="mdn.htm">mdn</a></CODE><DD>
Creates a Mixture Density Network with specified architecture.
<DT>
<CODE><a href="mdn2gmm.htm">mdn2gmm</a></CODE><DD>
Converts an MDN mixture data structure to an array of GMMs.
<DT>
<CODE><a href="mdndist2.htm">mdndist2</a></CODE><DD>
Calculates squared distance between centres of Gaussian kernels and data.
<DT>
<CODE><a href="mdnerr.htm">mdnerr</a></CODE><DD>
Evaluate error function for Mixture Density Network.
<DT>
<CODE><a href="mdnfwd.htm">mdnfwd</a></CODE><DD>
Forward propagation through Mixture Density Network.
<DT>
<CODE><a href="mdngrad.htm">mdngrad</a></CODE><DD>
Evaluate gradient of error function for Mixture Density Network.
<DT>
<CODE><a href="mdninit.htm">mdninit</a></CODE><DD>
Initialise the weights in a Mixture Density Network.
<DT>
<CODE><a href="mdnpak.htm">mdnpak</a></CODE><DD>
Combines weights and biases into one weights vector.
<DT>
<CODE><a href="mdnpost.htm">mdnpost</a></CODE><DD>
Computes the posterior probability for each MDN mixture component.
<DT>
<CODE><a href="mdnprob.htm">mdnprob</a></CODE><DD>
Computes the data likelihood for an MDN mixture structure.
<DT>
<CODE><a href="mdnunpak.htm">mdnunpak</a></CODE><DD>
Separates weights vector into weight and bias matrices.
<DT>
<CODE><a href="metrop.htm">metrop</a></CODE><DD>
Markov Chain Monte Carlo sampling with Metropolis algorithm.
<DT>
<CODE><a href="minbrack.htm">minbrack</a></CODE><DD>
Bracket a minimum of a function of one variable.
<DT>
<CODE><a href="mlp.htm">mlp</a></CODE><DD>
Create a 2-layer feedforward network.
<DT>
<CODE><a href="mlpbkp.htm">mlpbkp</a></CODE><DD>
Backpropagate gradient of error function for 2-layer network.
<DT>
<CODE><a href="mlpderiv.htm">mlpderiv</a></CODE><DD>
Evaluate derivatives of network outputs with respect to weights.
<DT>
<CODE><a href="mlperr.htm">mlperr</a></CODE><DD>
Evaluate error function for 2-layer network.
<DT>
<CODE><a href="mlpevfwd.htm">mlpevfwd</a></CODE><DD>
Forward propagation with evidence for MLP.
<DT>
<CODE><a href="mlpfwd.htm">mlpfwd</a></CODE><DD>
Forward propagation through 2-layer network.
<DT>
<CODE><a href="mlpgrad.htm">mlpgrad</a></CODE><DD>
Evaluate gradient of error function for 2-layer network.
<DT>
<CODE><a href="mlphdotv.htm">mlphdotv</a></CODE><DD>
Evaluate the product of the data Hessian with a vector.
<DT>
<CODE><a href="mlphess.htm">mlphess</a></CODE><DD>
Evaluate the Hessian matrix for a multi-layer perceptron network.
<DT>
<CODE><a href="mlphint.htm">mlphint</a></CODE><DD>
Plot Hinton diagram for 2-layer feed-forward network.
<DT>
<CODE><a href="mlpinit.htm">mlpinit</a></CODE><DD>
Initialise the weights in a 2-layer feedforward network.
<DT>
<CODE><a href="mlppak.htm">mlppak</a></CODE><DD>
Combines weights and biases into one weights vector.
<DT>
<CODE><a href="mlpprior.htm">mlpprior</a></CODE><DD>
Create Gaussian prior for MLP.
<DT>
<CODE><a href="mlptrain.htm">mlptrain</a></CODE><DD>
Utility to train an MLP network for demtrain.
<DT>
<CODE><a href="mlpunpak.htm">mlpunpak</a></CODE><DD>
Separates weights vector into weight and bias matrices.
<DT>
<CODE><a href="netderiv.htm">netderiv</a></CODE><DD>
Evaluate derivatives of network outputs by weights generically.
<DT>
<CODE><a href="neterr.htm">neterr</a></CODE><DD>
Evaluate network error function for generic optimizers.
<DT>
<CODE><a href="netevfwd.htm">netevfwd</a></CODE><DD>
Generic forward propagation with evidence for network.
<DT>
<CODE><a href="netgrad.htm">netgrad</a></CODE><DD>
Evaluate network error gradient for generic optimizers.
<DT>
<CODE><a href="nethess.htm">nethess</a></CODE><DD>
Evaluate network Hessian.
<DT>
<CODE><a href="netinit.htm">netinit</a></CODE><DD>
Initialise the weights in a network.
<DT>
<CODE><a href="netopt.htm">netopt</a></CODE><DD>
Optimize the weights in a network model.
<DT>
<CODE><a href="netpak.htm">netpak</a></CODE><DD>
Combines weights and biases into one weights vector.
<DT>
<CODE><a href="netunpak.htm">netunpak</a></CODE><DD>
Separates weights vector into weight and bias matrices.
<DT>
<CODE><a href="olgd.htm">olgd</a></CODE><DD>
On-line gradient descent optimization.
<DT>
<CODE><a href="pca.htm">pca</a></CODE><DD>
Principal Components Analysis.
<DT>
<CODE><a href="plotmat.htm">plotmat</a></CODE><DD>
Display a matrix.
<DT>
<CODE><a href="ppca.htm">ppca</a></CODE><DD>
Probabilistic Principal Components Analysis.
<DT>
<CODE><a href="quasinew.htm">quasinew</a></CODE><DD>
Quasi-Newton optimization.
<DT>
<CODE><a href="rbf.htm">rbf</a></CODE><DD>
Creates an RBF network with specified architecture.
<DT>
<CODE><a href="rbfbkp.htm">rbfbkp</a></CODE><DD>
Backpropagate gradient of error function for RBF network.
<DT>
<CODE><a href="rbfderiv.htm">rbfderiv</a></CODE><DD>
Evaluate derivatives of RBF network outputs with respect to weights.
<DT>
<CODE><a href="rbferr.htm">rbferr</a></CODE><DD>
Evaluate error function for RBF network.
<DT>
<CODE><a href="rbfevfwd.htm">rbfevfwd</a></CODE><DD>
Forward propagation with evidence for RBF.
<DT>
<CODE><a href="rbffwd.htm">rbffwd</a></CODE><DD>
Forward propagation through RBF network with linear outputs.
<DT>
<CODE><a href="rbfgrad.htm">rbfgrad</a></CODE><DD>
Evaluate gradient of error function for RBF network.
<DT>
<CODE><a href="rbfhess.htm">rbfhess</a></CODE><DD>
Evaluate the Hessian matrix for RBF network.
<DT>
<CODE><a href="rbfjacob.htm">rbfjacob</a></CODE><DD>
Evaluate derivatives of RBF network outputs with respect to inputs.
<DT>
<CODE><a href="rbfpak.htm">rbfpak</a></CODE><DD>
Combines all the parameters in an RBF network into one weights vector.
<DT>
<CODE><a href="rbfprior.htm">rbfprior</a></CODE><DD>
Create Gaussian prior and output layer mask for RBF.
<DT>
<CODE><a href="rbfsetbf.htm">rbfsetbf</a></CODE><DD>
Set basis functions of RBF from data.
<DT>
<CODE><a href="rbfsetfw.htm">rbfsetfw</a></CODE><DD>
Set basis function widths of RBF.
<DT>
<CODE><a href="rbftrain.htm">rbftrain</a></CODE><DD>
Two-stage training of RBF network.
<DT>
<CODE><a href="rbfunpak.htm">rbfunpak</a></CODE><DD>
Separates a vector of RBF weights into its components.
<DT>
<CODE><a href="rosegrad.htm">rosegrad</a></CODE><DD>
Calculate gradient of Rosenbrock's function.
<DT>
<CODE><a href="rosen.htm">rosen</a></CODE><DD>
Calculate Rosenbrock's function.
<DT>
<CODE><a href="scg.htm">scg</a></CODE><DD>
Scaled conjugate gradient optimization.
<DT>
<CODE><a href="som.htm">som</a></CODE><DD>
Creates a Self-Organising Map.
<DT>
<CODE><a href="somfwd.htm">somfwd</a></CODE><DD>
Forward propagation through a Self-Organising Map.
<DT>
<CODE><a href="sompak.htm">sompak</a></CODE><DD>
Combines node weights into one weights matrix.
<DT>
<CODE><a href="somtrain.htm">somtrain</a></CODE><DD>
Kohonen training algorithm for SOM.
<DT>
<CODE><a href="somunpak.htm">somunpak</a></CODE><DD>
Replaces node weights in SOM.
</DL>
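<p>As an illustration of how the functions above fit together, the following is a minimal sketch (not part of the original Netlab distribution; the data and option settings are invented for the example) of creating, training, and evaluating an MLP with <CODE>mlp</CODE>, <CODE>netopt</CODE>, and <CODE>mlpfwd</CODE>:

```matlab
% Hedged sketch of a typical Netlab session: fit a noisy sine curve
% with a 2-layer MLP. Option indices follow the Netlab/foptions
% convention; the specific values here are illustrative only.
x = linspace(0, 1, 100)';            % 100 one-dimensional inputs
t = sin(2*pi*x) + 0.1*randn(100, 1); % noisy regression targets

net = mlp(1, 5, 1, 'linear');        % 1 input, 5 hidden units, 1 linear output

options = zeros(1, 18);
options(1) = 1;                      % display error values during training
options(14) = 100;                   % number of training cycles

net = netopt(net, options, x, t, 'scg');  % train with scaled conjugate gradients
y = mlpfwd(net, x);                       % forward-propagate the trained network
```

The same <CODE>netopt</CODE> call accepts other optimiser names from the index (for example <CODE>'quasinew'</CODE> or <CODE>'graddesc'</CODE>), so the training algorithm can be swapped without changing the model code.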

<hr>
<p>Copyright (c) Christopher M Bishop, Ian T Nabney (1996, 1997)
</body>
</html>