net = glmtrain(net, options, x, t) uses the iterative reweighted least
squares (IRLS) algorithm to set the weights in the generalized linear model
structure net. This is a more efficient alternative to using glmerr and
glmgrad and a non-linear optimisation routine through netopt. Note that for
linear outputs, a single pass through the algorithm is all that is required,
since the error function is quadratic in the weights. The algorithm also
handles scalar alpha and beta terms. If you want to use more complicated
priors, you should use general-purpose non-linear optimisation algorithms.

For logistic and softmax outputs, general priors can be handled, although
this requires the pseudo-inverse of the Hessian, giving up the better
conditioning and some of the speed advantage of the normal form equations.

The error function value at the final set of weights is returned in
options(8). Each row of x corresponds to one input vector and each row of t
corresponds to one target vector.
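
As a rough illustration, the fragment below sketches a typical call for a
linear-output model. The data, dimensions and variable names are assumptions
made for the example, not part of the toolbox documentation, and reading the
error value back assumes that glmtrain is called with two output arguments
so that the updated options vector is returned.

  % Illustrative data: 100 cases, 4 inputs, 1 linear output (all assumed).
  nin = 4; nout = 1; ndata = 100;
  x = randn(ndata, nin);                            % each row is one input vector
  t = x*randn(nin, nout) + 0.1*randn(ndata, nout);  % each row is one target vector

  options = zeros(1, 18);             % default option settings
  options(1) = 1;                     % display error values during training

  net = glm(nin, nout, 'linear');     % generalised linear model structure
  [net, options] = glmtrain(net, options, x, t);

  % For linear outputs a single IRLS pass is sufficient; the final error
  % value is then available in options(8).
  fprintf('Final error: %g\n', options(8));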

The optional parameters have the following interpretations.

options(1) is set to 1 to display error values during training. If
options(1) is set to 0, then only warning messages are displayed. If
options(1) is -1, then nothing is displayed.

options(2) is a measure of the precision required for the value of the
weights w at the solution.

options(3) is a measure of the precision required of the objective function
at the solution. Both this and the previous condition must be satisfied for
termination.

options(5) is set to 1 if an approximation to the Hessian (which assumes
that all outputs are independent) is used for softmax outputs. With the
default value of 0 the exact Hessian (which is more expensive to compute)
is used.

options(14) is the maximum number of iterations for the IRLS algorithm;
default 100.
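
As a hedged sketch, these options might be configured as follows for a
softmax model; the synthetic data and the particular option values chosen
here are illustrative assumptions rather than recommendations from this
documentation.

  % Synthetic three-class data (dimensions and values are assumed).
  nin = 2; nclasses = 3;
  x = [randn(30, nin); randn(30, nin) + 3; randn(30, nin) - 3];
  labels = kron(eye(nclasses), ones(30, 1));    % 1-of-N coded target matrix

  options = zeros(1, 18);     % start from all-zero settings
  options(1) = 1;             % print error values during training
  options(2) = 1e-4;          % precision required of the weight values
  options(3) = 1e-4;          % precision required of the error function
  options(5) = 1;             % independent-output approximation to the Hessian
  options(14) = 50;           % at most 50 IRLS iterations (default 100)

  net = glm(nin, nclasses, 'softmax');
  [net, options] = glmtrain(net, options, x, labels);
  fprintf('Final error: %g\n', options(8));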

See also: glm, glmerr, glmgrad

Copyright (c) Ian T Nabney (1996-9)