toolboxes/FullBNT-1.0.7/netlab3.3/rbfhess.m

function [h, hdata] = rbfhess(net, x, t, hdata)
%RBFHESS Evaluate the Hessian matrix for RBF network.
%
% Description
% H = RBFHESS(NET, X, T) takes an RBF network data structure NET, a
% matrix X of input values, and a matrix T of target values and returns
% the full Hessian matrix H corresponding to the second derivatives of
% the negative log posterior distribution, evaluated for the current
% weight and bias values as defined by NET. Currently, the
% implementation only computes the Hessian for the output layer
% weights.
%
% [H, HDATA] = RBFHESS(NET, X, T) returns both the Hessian matrix H and
% the contribution HDATA arising from the data dependent term in the
% Hessian.
%
% H = RBFHESS(NET, X, T, HDATA) takes a network data structure NET, a
% matrix X of input values, and a matrix T of target values, together
% with the contribution HDATA arising from the data dependent term in
% the Hessian, and returns the full Hessian matrix H corresponding to
% the second derivatives of the negative log posterior distribution.
% This version saves computation time if HDATA has already been
% evaluated for the current weight and bias values.
%
% See also
% MLPHESS, HESSCHEK, EVIDENCE
%
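% Example
% The usage sketch below is illustrative and is not part of the original
% NETLAB help; it assumes data matrices x (two input columns) and t (one
% target column) already exist in the workspace. Since only the output
% layer Hessian is implemented, NET.MASK must select the output-layer
% weights (see the check in the sub-function DATAHESS below).
%
%    net = rbf(2, 5, 1, 'gaussian');
%    % Mask selecting only the output-layer weights and biases
%    net.mask = [zeros(net.nwts - net.nout*(net.nhidden+1), 1); ...
%                ones(net.nout*(net.nhidden+1), 1)];
%    [h, hdata] = rbfhess(net, x, t);   % Hessian and data-dependent term
%    h = rbfhess(net, x, t, hdata);     % reuse HDATA to save computation
%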

% Copyright (c) Ian T Nabney (1996-2001)

% Check arguments for consistency
errstring = consist(net, 'rbf', x, t);
if ~isempty(errstring)
  error(errstring);
end

if nargin == 3
  % Data term in Hessian needs to be computed
  [a, z] = rbffwd(net, x);   % z holds the hidden unit activations
  hdata = datahess(net, z, t);
end

% Add in effect of regularisation
[h, hdata] = hbayes(net, hdata);

% Sub-function to compute data part of Hessian
function hdata = datahess(net, z, t)

% Only works for output layer Hessian currently
if (isfield(net, 'mask') & ~any(net.mask(...
    1:(net.nwts - net.nout*(net.nhidden+1)))))
  hdata = zeros(net.nwts);
  ndata = size(z, 1);
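  % With linear output units and a sum-of-squares data term, the data part
  % of the Hessian for each output is Phi'*Phi, where Phi = [z, 1] is the
  % design matrix of hidden unit activations augmented with a bias column.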
  out_hess = [z ones(ndata, 1)]'*[z ones(ndata, 1)];
  for j = 1:net.nout
    hdata = rearrange_hess(net, j, out_hess, hdata);
  end
else
  error('Output layer Hessian only.');
end
return

% Sub-function to rearrange Hessian matrix
function hdata = rearrange_hess(net, j, out_hess, hdata)

% Because all the biases come after all the input weights,
% we have to rearrange the blocks that make up the network Hessian.
% This function assumes that we are on the jth output and that all outputs
% are independent.

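% Illustration (added comment, not in the original NETLAB source): with
% nhidden = 3 and nout = 2 the output-layer parameters occupy the last
% nout*(nhidden+1) = 8 slots of the packed weight vector as
%   [w(:,1); w(:,2); b(1); b(2)]
% so, as the indexing below implies, the weights for output j form
% nhidden consecutive entries starting at ob_start, while its bias sits
% at bb_start + (j-1) among the final nout entries.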
% Start of bias weights block
bb_start = net.nwts - net.nout + 1;
% Start of weight block for jth output
ob_start = net.nwts - net.nout*(net.nhidden+1) + (j-1)*net.nhidden ...
    + 1;
% End of weight block for jth output
ob_end = ob_start + net.nhidden - 1;
% Index of bias weight
b_index = bb_start + (j-1);
% Put input weight block in right place
hdata(ob_start:ob_end, ob_start:ob_end) = out_hess(1:net.nhidden, ...
    1:net.nhidden);
% Put second derivative of bias weight in right place
hdata(b_index, b_index) = out_hess(net.nhidden+1, net.nhidden+1);
% Put cross terms (input weight v bias weight) in right place
hdata(b_index, ob_start:ob_end) = out_hess(net.nhidden+1, ...
    1:net.nhidden);
hdata(ob_start:ob_end, b_index) = out_hess(1:net.nhidden, ...
    net.nhidden+1);

return