<html>
<head>
<title>
Netlab Reference Manual scg
</title>
</head>
<body>
<H1> scg
</H1>
<h2>
Purpose
</h2>
Scaled conjugate gradient optimization.

<p><h2>
Description
</h2>
<CODE>[x, options] = scg(f, x, options, gradf)</CODE> uses a scaled conjugate
gradient algorithm to find a local minimum of the function <CODE>f(x)</CODE>
whose gradient is given by <CODE>gradf(x)</CODE>. Here <CODE>x</CODE> is a
row vector and <CODE>f</CODE> returns a scalar value.
The point at which <CODE>f</CODE> has a local minimum
is returned as <CODE>x</CODE>. The function value at that point is returned
in <CODE>options(8)</CODE>.

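<p>As a minimal illustration of the basic call (a sketch, not part of the
original manual: it assumes the Netlab demonstration functions
<CODE>rosen</CODE> and <CODE>rosegrad</CODE>, which compute Rosenbrock's
function and its gradient, are on the path):
<PRE>

% Hypothetical example: minimize Rosenbrock's function
options = zeros(1, 18);  % Netlab-style options vector
options(1) = 1;          % display error values
options(2) = 1e-4;       % precision required in x
options(3) = 1e-4;       % precision required in f
options(14) = 200;       % maximum number of iterations
[x, options] = scg('rosen', [-1 1], options, 'rosegrad');
fprintf('Minimum found at (%g, %g), f = %g\n', x(1), x(2), options(8));
</PRE>
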
<p><CODE>[x, options, flog, pointlog, scalelog] = scg(f, x, options, gradf)</CODE>
also (optionally) returns a log of the function values
after each cycle in <CODE>flog</CODE>, a log
of the points visited in <CODE>pointlog</CODE>, and a log of the scale values
used in the algorithm in <CODE>scalelog</CODE>.

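<p>For example, the function-value log might be used to inspect convergence
(an illustrative fragment, continuing the hypothetical call above):
<PRE>

[x, options, flog, pointlog] = scg('rosen', [-1 1], options, 'rosegrad');
plot(flog);                    % error value after each cycle
xlabel('Cycle'), ylabel('Function value')
</PRE>
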
<p><CODE>scg(f, x, options, gradf, p1, p2, ...)</CODE> allows
additional arguments to be passed to <CODE>f()</CODE> and <CODE>gradf()</CODE>.

<p>The optional parameters have the following interpretations.

<p><CODE>options(1)</CODE> is set to 1 to display error values; this also
logs the error values in the return argument <CODE>flog</CODE> and the
points visited in the return argument <CODE>pointlog</CODE>. If
<CODE>options(1)</CODE> is set to 0, then only warning messages are
displayed. If <CODE>options(1)</CODE> is -1, then nothing is displayed.

<p><CODE>options(2)</CODE> is a measure of the absolute precision required for
the value of <CODE>x</CODE> at the solution. If the absolute difference in
<CODE>x</CODE> between two successive steps is less than
<CODE>options(2)</CODE>, then this condition is satisfied.

<p><CODE>options(3)</CODE> is a measure of the precision required of the
objective function at the solution. If the absolute difference in the
objective function values between two successive steps is less than
<CODE>options(3)</CODE>, then this condition is satisfied.
Both this and the previous condition must be
satisfied for termination.

<p><CODE>options(9)</CODE> is set to 1 to check the user-defined gradient function.

<p><CODE>options(10)</CODE> returns the total number of function evaluations (including
those in any line searches).

<p><CODE>options(11)</CODE> returns the total number of gradient evaluations.

<p><CODE>options(14)</CODE> is the maximum number of iterations; default 100.

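<p>The termination and bookkeeping options above might be combined as
follows (an illustrative sketch: the network <CODE>net</CODE>, the initial
weight vector <CODE>w</CODE> and the data <CODE>xdata</CODE> and
<CODE>t</CODE> are assumed to exist already):
<PRE>

options = zeros(1, 18);
options(2) = 1e-5;    % absolute precision required in x
options(3) = 1e-5;    % precision required of the objective function
options(9) = 1;       % check the user-defined gradient function
options(14) = 100;    % maximum number of iterations
[w, options] = scg('neterr', w, options, 'netgrad', net, xdata, t);
fprintf('Final error %g after %d function and %d gradient evaluations\n', ...
        options(8), options(10), options(11));
</PRE>
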
<p><h2>
Examples
</h2>
An example of the use of the additional arguments is the minimization
of an error function for a neural network:
<PRE>

w = scg('neterr', w, options, 'netgrad', net, x, t);
</PRE>

<p><h2>
Algorithm
</h2>
The search direction is re-started after every <CODE>nparams</CODE>
successful weight updates, where <CODE>nparams</CODE> is the total number of
parameters in <CODE>x</CODE>. The algorithm is based on that given by Williams
(1991), with a simplified procedure for updating <CODE>lambda</CODE> when
<CODE>rho &lt; 0.25</CODE>.

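<p>The update of <CODE>lambda</CODE> follows the usual trust-region pattern,
of which the following is a schematic sketch only (the exact constants and
formulae used in <CODE>scg.m</CODE> may differ):
<PRE>

% rho compares the actual reduction in f with the reduction
% predicted by the local quadratic model
if rho &gt; 0.75
  lambda = 0.5*lambda;    % good agreement: trust the model more
elseif rho &lt; 0.25
  lambda = 4.0*lambda;    % poor agreement: take more cautious steps
end
</PRE>
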
<p><h2>
See Also
</h2>
<CODE><a href="conjgrad.htm">conjgrad</a></CODE>, <CODE><a href="quasinew.htm">quasinew</a></CODE><hr>
<b>Pages:</b>
<a href="index.htm">Index</a>
<hr>
<p>Copyright (c) Ian T Nabney (1996-9)

</body>
</html>