h = mlphess(net, x, t)
[h, hdata] = mlphess(net, x, t)
h = mlphess(net, x, t, hdata)

h = mlphess(net, x, t) takes an MLP network data structure net, a matrix x of input values, and a matrix t of target values, and returns the full Hessian matrix h corresponding to the second derivatives of the negative log posterior distribution, evaluated for the current weight and bias values as defined by net.
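
As a rough usage sketch (not taken from the original page; the network sizes, hyperparameter values, and data below are invented purely for illustration):

% Hypothetical example: small MLP with scalar Gaussian prior alpha and
% inverse noise variance beta, evaluated on toy regression data.
nin = 2; nhidden = 3; nout = 1;
alpha = 0.01;                    % assumed prior (weight decay) coefficient
beta = 10.0;                     % assumed inverse noise variance
net = mlp(nin, nhidden, nout, 'linear', alpha, beta);

x = randn(20, nin);              % toy inputs, one pattern per row
t = randn(20, nout);             % toy targets

h = mlphess(net, x, t);          % full nwts x nwts Hessian at the current weights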

[h, hdata] = mlphess(net, x, t) returns both the Hessian matrix h and the contribution hdata arising from the data-dependent term in the Hessian.

h = mlphess(net, x, t, hdata) takes a network data structure net, a matrix x of input values, and a matrix t of target values, together with the contribution hdata arising from the data-dependent term in the Hessian, and returns the full Hessian matrix h corresponding to the second derivatives of the negative log posterior distribution. This version saves computation time if hdata has already been evaluated for the current weight and bias values.
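
Continuing the sketch above (again an assumption about intended use, not text from the original page), the data-dependent term can be computed once and then reused when only the hyperparameters change:

[h, hdata] = mlphess(net, x, t);     % hdata: the expensive data-dependent term

net.alpha = 0.1;                     % example revised hyperparameter values
net.beta = 20.0;
hnew = mlphess(net, x, t, hdata);    % reuses hdata rather than recomputing the data term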

The Hessian is computed as

h = beta*hd + alpha*I

where the contribution hd is evaluated by calls to mlphdotv and h is the full Hessian.
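
Since hd is built up by calls to mlphdotv, a sketch of such an assembly could look as follows (an illustration only, reusing net, x and t from the first sketch and assuming mlphdotv(net, x, t, v) returns the product of the data Hessian with the vector v; this is not the toolbox source):

nwts = net.nwts;                         % total number of weights and biases
hd = zeros(nwts, nwts);
for k = 1:nwts
  v = zeros(nwts, 1);
  v(k) = 1;                              % unit vector picking out one row
  hd(k, :) = mlphdotv(net, x, t, v);     % data Hessian times the unit vector
end
h = net.beta*hd + net.alpha*eye(nwts);   % full Hessian as in the formula above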

See also: mlp, hesschek, mlphdotv, evidence

Copyright (c) Ian T Nabney (1996-9)