extra = fevbayes(net, y, a, x, t, x_test)
[extra, invhess] = fevbayes(net, y, a, x, t, x_test, invhess)
extra = fevbayes(net, y, a, x, t, x_test) takes a network data structure net, together with a set of hidden unit activations a computed from the test inputs x_test, and the training data inputs x and targets t, and outputs a matrix of extra information extra. This consists of error bars (variances) for a regression problem, or moderated outputs for a classification problem. The optional argument (and return value) invhess is the inverse of the network Hessian computed on the training data inputs and targets. Passing it in avoids recomputing it, which can be a significant saving for large training sets.

This function is called by network-specific functions such as mlpevfwd, which are needed because the return values (predictions and hidden unit activations) for different network types come in different orders (for good reasons).

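As a sketch of typical use, the invhess caching can be exercised through mlpevfwd rather than by calling fevbayes directly. The data, network sizes, and hyperparameter values below are illustrative only, and the example assumes a standard Netlab setup (mlp, netopt, and the legacy foptions options vector):

```matlab
% Illustrative regression problem (assumed data, not from this document).
x = linspace(0, 1, 50)';             % training inputs
t = sin(2*pi*x) + 0.1*randn(50, 1);  % noisy training targets
x_test = linspace(0, 1, 200)';       % test inputs

% MLP with 5 hidden units; 0.01 is the weight prior, 50 the noise precision
% (both chosen arbitrarily for this sketch).
net = mlp(1, 5, 1, 'linear', 0.01, 50);
options = foptions;
options(14) = 100;                   % max. training iterations
net = netopt(net, options, x, t, 'scg');

% First call computes the inverse Hessian and returns it; later calls can
% pass it back in so fevbayes does not recompute it.
[y, extra, invhess] = mlpevfwd(net, x, t, x_test);
[y2, extra2] = mlpevfwd(net, x, t, x_test, invhess);
```

Here extra holds the predictive error bars for the regression outputs; reusing invhess matters most when the training set, and hence the Hessian computation, is large.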
See also: mlpevfwd, rbfevfwd, glmevfwd

Copyright (c) Ian T Nabney (1996-9)