function [extra, invhess] = fevbayes(net, y, a, x, t, x_test, invhess)
%FEVBAYES Evaluate Bayesian regularisation for network forward propagation.
%
%	Description
%	EXTRA = FEVBAYES(NET, Y, A, X, T, X_TEST) takes a network data
%	structure NET together with a set of hidden unit activations A from
%	test inputs X_TEST, and training data inputs X and targets T, and
%	outputs a matrix of extra information EXTRA that consists of error
%	bars (variance) for a regression problem or moderated outputs for a
%	classification problem. The optional argument (and return value)
%	INVHESS is the inverse of the network Hessian computed on the
%	training data inputs and targets. Passing it in avoids recomputing
%	it, which can be a significant saving for large training sets.
%
%	This function is called by network-specific functions such as
%	MLPEVFWD, which are needed since the return values (predictions and
%	hidden unit activations) for different network types are in
%	different orders (for good reasons).
%
%	See also
%	MLPEVFWD, RBFEVFWD, GLMEVFWD
%

%	Copyright (c) Ian T Nabney (1996-2001)

w = netpak(net);
% Derivatives of the network outputs with respect to the weights at each
% test input: g has dimensions ntest x nweights x nout.
g = netderiv(w, net, x_test);
if nargin < 7
  % Inverse Hessian not supplied, so compute it from the training data
  hess = nethess(w, net, x, t);
  invhess = inv(hess);
end

ntest = size(x_test, 1);
var = zeros(ntest, net.nout);
for idx = 1:net.nout
  for n = 1:ntest
    grad = squeeze(g(n, :, idx));
    % Output variance under the Laplace approximation: grad*invhess*grad'
    var(n, idx) = grad*invhess*grad';
  end
end

switch net.outfn
  case 'linear'
    % extra is variance: intrinsic noise 1/beta plus weight uncertainty
    extra = ones(size(var))./net.beta + var;
  case 'logistic'
    % extra is moderated output: scale the activations by
    % kappa = 1/sqrt(1 + pi*var/8) before applying the sigmoid
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    extra = 1./(1 + exp(-kappa.*a));
  case 'softmax'
    % Use extended MacKay formula; beware that this may not
    % be very accurate
    kappa = 1./(sqrt(ones(size(var)) + (pi.*var)./8));
    temp = exp(kappa.*a);
    extra = temp./(sum(temp, 2)*ones(1, net.nout));
  otherwise
    error(['Unknown activation function ', net.outfn]);
end
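
% Usage sketch (not part of the original file): FEVBAYES is normally
% reached through the network-specific wrappers named in the help text.
% This example assumes a Netlab MLP and the standard MLPEVFWD signature.
%
%   net = mlp(nin, nhidden, nout, 'linear', alpha, beta);
%   % ... train net on inputs x and targets t, e.g. with NETOPT ...
%   [y, extra, invhess] = mlpevfwd(net, x, t, x_test);
%   % extra holds the predictive variances; pass invhess back in to
%   % avoid recomputing the Hessian for further test sets:
%   [y2, extra2] = mlpevfwd(net, x, t, x_test2, invhess);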