<html>
<head>
<title>
Netlab Reference Manual glm
</title>
</head>
<body>
<H1> glm
</H1>
<h2>
Purpose
</h2>
Create a generalized linear model.

<p><h2>
Synopsis
</h2>
<PRE>
net = glm(nin, nout, func)
net = glm(nin, nout, func, prior)
net = glm(nin, nout, func, prior, beta)
</PRE>


<p><h2>
Description
</h2>

<p><CODE>net = glm(nin, nout, func)</CODE> takes the number of inputs
and outputs for a generalized linear model, together with a string
<CODE>func</CODE> which specifies the output unit activation function,
and returns a data structure <CODE>net</CODE>. The weights are drawn from a
zero-mean isotropic Gaussian whose variance is scaled by the fan-in of
the output units. This makes use of the Matlab function
<CODE>randn</CODE>, so the seed for the random weight initialization can be
set with <CODE>randn('state', s)</CODE>, where <CODE>s</CODE> is the seed
value. The optional argument <CODE>prior</CODE> sets the inverse variance
for the weight initialization.

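<p>As a quick sketch (the sizes and seed value here are illustrative),
a model can be created reproducibly like this:
<PRE>
% Fix the seed so the random weight initialization is repeatable
randn('state', 42);
net = glm(4, 3, 'softmax');        % 4 inputs, 3 softmax outputs
net = glm(4, 3, 'softmax', 0.01);  % as above, also setting net.alpha = 0.01
</PRE>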
<p>The fields in <CODE>net</CODE> are
<PRE>
  type = 'glm'
  nin = number of inputs
  nout = number of outputs
  nwts = total number of weights and biases
  actfn = string describing the output unit activation function:
      'linear'
      'logistic'
      'softmax'
  w1 = first-layer weight matrix
  b1 = first-layer bias vector
</PRE>


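<p>Since each output unit has one weight per input plus a bias, the total
parameter count for a model of this form should be
<CODE>(nin + 1)*nout</CODE>. A small sketch (sizes are illustrative):
<PRE>
net = glm(4, 3, 'linear');
net.nwts        % (4 + 1) * 3 = 15: elements of w1 plus elements of b1
size(net.w1)    % 4-by-3 first-layer weight matrix
size(net.b1)    % 1-by-3 bias vector
</PRE>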
<p><CODE>net = glm(nin, nout, func, prior)</CODE>, in which
<CODE>prior</CODE> is a scalar, sets the field <CODE>net.alpha</CODE> in
the data structure <CODE>net</CODE>, corresponding to a zero-mean
isotropic Gaussian prior whose inverse variance has value
<CODE>prior</CODE>. Alternatively, <CODE>prior</CODE> can be a data
structure with fields <CODE>alpha</CODE> and <CODE>index</CODE>, allowing
individual Gaussian priors to be set over groups of weights in the
network. Here <CODE>alpha</CODE> is a column vector in which each element
corresponds to a separate group of weights; the groups need not be
mutually exclusive. The membership of the groups is defined by the
matrix <CODE>index</CODE>, whose columns correspond to the elements of
<CODE>alpha</CODE>. Each column has one element for each weight in the
network, in the order defined by the function <CODE>glmpak</CODE>, and
each element is 1 or 0 according to whether the weight is a member of
the corresponding group or not.

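<p>For example, a two-group prior that regularizes the first-layer
weights and the biases separately might be set up as follows (the
grouping and the <CODE>alpha</CODE> values are illustrative, and the
indexing assumes <CODE>glmpak</CODE> orders the weight matrix before the
biases):
<PRE>
nin = 3; nout = 2;
nwts = (nin + 1)*nout;                 % weights plus biases
prior.alpha = [0.1; 1.0];              % one inverse variance per group
prior.index = zeros(nwts, 2);
prior.index(1:nin*nout, 1) = 1;        % group 1: first-layer weights
prior.index(nin*nout+1:nwts, 2) = 1;   % group 2: biases
net = glm(nin, nout, 'logistic', prior);
</PRE>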
<p><CODE>net = glm(nin, nout, func, prior, beta)</CODE> also sets the
additional field <CODE>net.beta</CODE> in the data structure
<CODE>net</CODE>, where <CODE>beta</CODE> corresponds to the inverse noise
variance.

<p><h2>
See Also
</h2>
<CODE><a href="glmpak.htm">glmpak</a></CODE>, <CODE><a href="glmunpak.htm">glmunpak</a></CODE>, <CODE><a href="glmfwd.htm">glmfwd</a></CODE>, <CODE><a href="glmerr.htm">glmerr</a></CODE>, <CODE><a href="glmgrad.htm">glmgrad</a></CODE>, <CODE><a href="glmtrain.htm">glmtrain</a></CODE><hr>
<b>Pages:</b>
<a href="index.htm">Index</a>
<hr>
<p>Copyright (c) Ian T Nabney (1996-9)


</body>
</html>