function [extra, invhess] = fevbayes(net, y, a, x, t, x_test, invhess)
%FEVBAYES Evaluate Bayesian regularisation for network forward propagation.
%
% Description
% EXTRA = FEVBAYES(NET, Y, A, X, T, X_TEST) takes a network data
% structure NET together with a set of output unit activations A (the
% summed inputs to the output units) from test inputs X_TEST, training
% data inputs X and targets T, and outputs a matrix of extra
% information EXTRA that consists of error bars (variance) for a
% regression problem or moderated outputs for a classification
% problem. The optional argument (and return value) INVHESS is the
% inverse of the network Hessian computed on the training data inputs
% and targets. Passing it in avoids recomputing it, which can be a
% significant saving for large training sets.
%
% This function is called by network-specific functions such as
% MLPEVFWD, which are needed because the return values (predictions and
% output unit activations) are ordered differently for different
% network types (for good reasons).
%
% See also
% MLPEVFWD, RBFEVFWD, GLMEVFWD
%

% Copyright (c) Ian T Nabney (1996-2001)

w = netpak(net);
g = netderiv(w, net, x_test);
if nargin < 7
  % Need to compute the inverse Hessian of the error on the training data
  hess = nethess(w, net, x, t);
  invhess = inv(hess);
end

% Posterior variance of each output activation at each test point:
% the quadratic form grad*invhess*grad' of the output derivatives
% with the inverse Hessian.
ntest = size(x_test, 1);
var = zeros(ntest, net.nout);
for idx = 1:net.nout
  for n = 1:ntest
    grad = squeeze(g(n,:,idx));
    var(n,idx) = grad*invhess*grad';
  end
end

switch net.outfn
  case 'linear'
    % extra is variance: output noise (1/beta) plus weight uncertainty
    extra = ones(size(var))./net.beta + var;
  case 'logistic'
    % extra is moderated output: scale the activations by kappa
    % before applying the logistic (MacKay's moderation formula)
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    extra = 1./(1 + exp(-kappa.*a));
  case 'softmax'
    % Use extended MacKay formula; beware that this may not
    % be very accurate
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    temp = exp(kappa.*a);
    extra = temp./(sum(temp, 2)*ones(1, net.nout));
  otherwise
    error(['Unknown activation function ', net.outfn]);
end
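
% Usage sketch (not part of the original Netlab sources; it assumes a
% trained MLP and the documented Netlab interfaces of MLP, MLPFWD and
% MLPEVFWD):
%
%   net = mlp(nin, nhidden, nout, 'linear', alpha, beta);
%   % ... train net on x, t, e.g. with NETOPT, re-estimating the
%   % hyperparameters alpha and beta with EVIDENCE ...
%   [ypred, extra] = mlpevfwd(net, x, t, x_test);  % extra holds error bars
%
% To amortise the Hessian computation over several test sets, capture
% INVHESS on the first call and pass it back in on later calls:
%
%   [ytest, z, act] = mlpfwd(net, x_test);
%   [extra, invhess] = fevbayes(net, ytest, act, x, t, x_test);
%   [ytest2, z2, act2] = mlpfwd(net, x_test2);
%   extra2 = fevbayes(net, ytest2, act2, x, t, x_test2, invhess);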