toolboxes/FullBNT-1.0.7/netlab3.3/fevbayes.m @ 0:e9a9cd732c1e (tip)

first hg version after svn
author: wolffd
date:   Tue, 10 Feb 2015 15:05:51 +0000
function [extra, invhess] = fevbayes(net, y, a, x, t, x_test, invhess)
%FEVBAYES Evaluate Bayesian regularisation for network forward propagation.
%
% Description
% EXTRA = FEVBAYES(NET, Y, A, X, T, X_TEST) takes a network data
% structure NET together with the network outputs Y and output unit
% activations A computed from the test inputs X_TEST, and the training
% data inputs X and targets T, and returns a matrix of extra
% information EXTRA that consists of error bars (variance) for a
% regression problem or moderated outputs for a classification problem.
% The optional argument (and return value) INVHESS is the inverse of
% the network Hessian computed on the training data inputs and targets.
% Passing it in avoids recomputing it, which can be a significant
% saving for large training sets.
%
% This function is called by network-specific functions such as
% MLPEVFWD, which are needed because the return values (predictions and
% hidden unit activations) of the different network types are in
% different orders (for good reasons).
%
% See also
% MLPEVFWD, RBFEVFWD, GLMEVFWD
%

% Copyright (c) Ian T Nabney (1996-2001)

w = netpak(net);
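% Derivatives of the network outputs with respect to the weights, evaluated
% at each test input (an ntest x nwts x nout array).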
g = netderiv(w, net, x_test);
if nargin < 7
  % Need to compute the inverse Hessian
  hess = nethess(w, net, x, t);
  invhess = inv(hess);
end

ntest = size(x_test, 1);
var = zeros(ntest, net.nout);
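% Under the Laplace approximation, invhess acts as the posterior covariance
% of the network weights, so grad*invhess*grad' below is the variance of
% each output at each test input due to uncertainty in the weights.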
for idx = 1:net.nout
  for n = 1:ntest
    grad = squeeze(g(n, :, idx));
    var(n, idx) = grad*invhess*grad';
  end
end

switch net.outfn
  case 'linear'
    % extra is variance
    extra = ones(size(var))./net.beta + var;
  case 'logistic'
    % extra is moderated output
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    extra = 1./(1 + exp(-kappa.*a));
  case 'softmax'
    % Use extended Mackay formula; beware that this may not
    % be very accurate
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    temp = exp(kappa.*a);
    extra = temp./(sum(temp, 2)*ones(1, net.nout));
  otherwise
    error(['Unknown activation function ', net.outfn]);
end
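
For context, FEVBAYES is normally reached through one of the wrappers listed
under "See also" rather than called directly. The sketch below outlines a
typical Netlab workflow for a regression network with error bars; it is
illustrative only (the hyperparameter values, data variables x, t, x_test and
the training step are assumptions, not part of this file), loosely following
the Netlab evidence demos.

% Illustrative sketch -- hyperparameters and data (x, t, x_test) are assumed.
net = mlp(1, 3, 1, 'linear', 0.01, 50.0);  % alpha = 0.01, beta = 50

% ... train net on inputs x and targets t, e.g.
% options = foptions; net = netopt(net, options, x, t, 'scg'); ...

% Forward propagation with Bayesian error bars; MLPEVFWD calls FEVBAYES
% internally and the returned INVHESS can be passed back in to avoid
% recomputing the Hessian on later calls.
[y, extra, invhess] = mlpevfwd(net, x, t, x_test);
sig = sqrt(extra);  % predictive standard deviation at each test input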