toolboxes/FullBNT-1.0.7/netlab3.3/rbffwd.m @ 0:e9a9cd732c1e tip

first hg version after svn
author wolffd
date Tue, 10 Feb 2015 15:05:51 +0000
function [a, z, n2] = rbffwd(net, x)
%RBFFWD Forward propagation through RBF network with linear outputs.
%
% Description
% A = RBFFWD(NET, X) takes a network data structure NET and a matrix X
% of input vectors and forward propagates the inputs through the
% network to generate a matrix A of output vectors. Each row of X
% corresponds to one input vector and each row of A contains the
% corresponding output vector. The activation function that is used is
% determined by NET.ACTFN.
%
% [A, Z, N2] = RBFFWD(NET, X) also generates a matrix Z of the hidden
% unit activations, where each row corresponds to one pattern. These
% hidden unit activations represent the design matrix for the RBF. The
% matrix N2 contains the squared distances between each basis function
% centre and each pattern, with each row corresponding to a data point.
%
% See also
% RBF, RBFERR, RBFGRAD, RBFPAK, RBFTRAIN, RBFUNPAK
%

% Copyright (c) Ian T Nabney (1996-2001)

% Check arguments for consistency
errstring = consist(net, 'rbf', x);
if ~isempty(errstring)
  error(errstring);
end

[ndata, data_dim] = size(x);

% Calculate squared norm matrix, of dimension (ndata, ncentres)
n2 = dist2(x, net.c);

% Switch on activation function type
switch net.actfn

  case 'gaussian'    % Gaussian
    % Calculate width factors: net.wi contains squared widths
    wi2 = ones(ndata, 1) * (2 .* net.wi);

    % Now compute the activations
    z = exp(-(n2./wi2));

  case 'tps'         % Thin plate spline
    z = n2.*log(n2+(n2==0));

  case 'r4logr'      % r^4 log r
    z = n2.*n2.*log(n2+(n2==0));

  otherwise
    error('Unknown activation function in rbffwd')
end

a = z*net.w2 + ones(ndata, 1)*net.b2;
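
A minimal usage sketch, assuming the Netlab constructor rbf(nin, nhidden, nout, rbfunc) from the same netlab3.3 directory; the centres and widths are set by hand here for illustration rather than fitted with rbftrain or rbfsetbf:

nin = 2; nhidden = 4; nout = 1;
net = rbf(nin, nhidden, nout, 'gaussian');   % assumed Netlab constructor

% Place the basis functions by hand, purely for illustration.
net.c  = randn(nhidden, nin);                % centres, one per row
net.wi = ones(1, nhidden);                   % squared widths

x = randn(10, nin);                          % ten input patterns, one per row
[a, z, n2] = rbffwd(net, x);

% a  : 10 x nout outputs of the linear layer, z*net.w2 plus bias
% z  : 10 x nhidden design matrix of basis function activations
% n2 : 10 x nhidden squared distances, n2 = dist2(x, net.c)
%
% For the 'gaussian' case, z(i,j) = exp(-n2(i,j) / (2*net.wi(j))), which
% can be checked against the returned design matrix:
check = max(max(abs(z - exp(-n2 ./ (2 * repmat(net.wi, 10, 1))))));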