<html>
<head>
<title>
Netlab Reference Manual mlpbkp
</title>
</head>
<body>
<H1> mlpbkp
</H1>
<h2>
Purpose
</h2>
Backpropagate the gradient of the error function for a 2-layer network.

<p><h2>
Synopsis
</h2>
<PRE>
g = mlpbkp(net, x, z, deltas)</PRE>


<p><h2>
Description
</h2>
<CODE>g = mlpbkp(net, x, z, deltas)</CODE> takes a network data structure
<CODE>net</CODE> together with a matrix <CODE>x</CODE> of input vectors, a matrix
<CODE>z</CODE> of hidden unit activations, and a matrix <CODE>deltas</CODE> of the
gradient of the error function with respect to the values of the
output units (i.e. the summed inputs to the output units, before the
activation function is applied). The return value is the gradient
<CODE>g</CODE> of the error function with respect to the network
weights. Each row of <CODE>x</CODE> corresponds to one input vector.

<p>This function is provided so that the common backpropagation algorithm
can be used by multi-layer perceptron network models to compute
gradients for mixture density networks as well as for standard error
functions.

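<p>To illustrate the computation this routine performs, here is a rough sketch in Python/NumPy rather than MATLAB (it is not part of Netlab). It assumes tanh hidden units and takes the second-layer weight matrix as an explicit argument <CODE>w2</CODE>, whereas the real <CODE>mlpbkp</CODE> reads the weights from the <CODE>net</CODE> structure and packs all the gradients into a single vector <CODE>g</CODE>.

```python
import numpy as np

def mlp_backprop_sketch(x, z, deltas, w2):
    """Hedged sketch of 2-layer MLP backpropagation, mlpbkp-style.

    x      : (N, d) matrix of input vectors, one per row
    z      : (N, h) hidden-unit activations (assumed tanh here)
    deltas : (N, c) dE/da at the output units (summed inputs, pre-activation)
    w2     : (h, c) second-layer weights (hypothetical explicit argument;
                    Netlab stores these inside the `net` structure)

    Returns gradients of the error w.r.t. both weight layers and biases.
    """
    # Second-layer gradients: hidden activations against output deltas.
    gw2 = z.T @ deltas
    gb2 = deltas.sum(axis=0)
    # Backpropagate through the (assumed) tanh hidden units: tanh'(a) = 1 - z^2.
    delhid = (deltas @ w2.T) * (1.0 - z ** 2)
    # First-layer gradients: inputs against backpropagated hidden deltas.
    gw1 = x.T @ delhid
    gb1 = delhid.sum(axis=0)
    return gw1, gb1, gw2, gb2
```

<p>For a sum-of-squares error, <CODE>deltas</CODE> is simply the matrix of output activations minus targets, which is why supplying the deltas directly lets the same routine serve both standard error functions and mixture density networks.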
<p><h2>
See Also
</h2>
<CODE><a href="mlp.htm">mlp</a></CODE>, <CODE><a href="mlpgrad.htm">mlpgrad</a></CODE>, <CODE><a href="mlpderiv.htm">mlpderiv</a></CODE>, <CODE><a href="mdngrad.htm">mdngrad</a></CODE><hr>
<b>Pages:</b>
<a href="index.htm">Index</a>
<hr>
<p>Copyright (c) Ian T Nabney (1996-9)


</body>
</html>