function [extra, invhess] = fevbayes(net, y, a, x, t, x_test, invhess)
%FEVBAYES Evaluate Bayesian regularisation for network forward propagation.
%
% Description
% EXTRA = FEVBAYES(NET, Y, A, X, T, X_TEST) takes a network data
% structure NET together with the network outputs Y and output unit
% activations (summed inputs to the output units) A computed from the
% test inputs X_TEST, and the training data inputs X and targets T. It
% returns a matrix of extra information EXTRA that consists of error
% bars (variances) for a regression problem or moderated outputs for a
% classification problem. The optional argument (and return value)
% INVHESS is the inverse of the network Hessian, computed on the
% training data inputs and targets; passing it in avoids recomputing
% it, which can be a significant saving for large training sets.
%
% This function is called by network-specific functions such as
% MLPEVFWD; these wrappers are needed because the forward propagation
% routines of different network types return their predictions and
% unit activations in different orders (for good reasons).
%
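% Example
% The following is an illustrative sketch, not part of the original
% documentation; it assumes an MLP network and uses MLPFWD to obtain
% the outputs Y and output unit activations A, with NET, X, T, X_TEST
% and X_NEW as placeholder variables:
%
%    [y, z, a] = mlpfwd(net, x_test);
%    [extra, invhess] = fevbayes(net, y, a, x, t, x_test);
%    % Reuse the inverse Hessian on further test data to avoid recomputing it
%    [ynew, znew, anew] = mlpfwd(net, x_new);
%    extra_new = fevbayes(net, ynew, anew, x, t, x_new, invhess);
%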
% See also
% MLPEVFWD, RBFEVFWD, GLMEVFWD
%

% Copyright (c) Ian T Nabney (1996-2001)

w = netpak(net);
g = netderiv(w, net, x_test);
if nargin < 7
  % Need to compute inverse hessian
  hess = nethess(w, net, x, t);
  invhess = inv(hess);
end

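% Variance of each network output under the Laplace approximation:
% var(n, idx) = g*invhess*g', where g is the derivative of output idx
% at test point n with respect to the network weights (from NETDERIV).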
ntest = size(x_test, 1);
var = zeros(ntest, net.nout);
for idx = 1:net.nout
  for n = 1:ntest
    grad = squeeze(g(n,:,idx));
    var(n,idx) = grad*invhess*grad';
  end
end

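% Combine the weight uncertainty with the output model: linear outputs
% get the predictive variance 1/beta + var, while logistic and softmax
% outputs are moderated by MacKay's factor kappa = 1/sqrt(1 + pi*var/8)
% before the squashing function is applied.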
switch net.outfn
  case 'linear'
    % extra is variance
    extra = ones(size(var))./net.beta + var;
  case 'logistic'
    % extra is moderated output
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    extra = 1./(1 + exp(-kappa.*a));
  case 'softmax'
    % Use extended MacKay formula; beware that this may not
    % be very accurate
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    temp = exp(kappa.*a);
    extra = temp./(sum(temp, 2)*ones(1, net.nout));
  otherwise
    error(['Unknown activation function ', net.outfn]);
end