function demopt1(xinit)
%DEMOPT1 Demonstrate different optimisers on Rosenbrock's function.
%
% Description
% The four general optimisers (quasi-Newton, conjugate gradients,
% scaled conjugate gradients, and gradient descent) are applied to the
% minimisation of Rosenbrock's well known `banana' function. Each
% optimiser is run for at most 100 cycles, and a stopping criterion of
% 1.0e-4 is used for the position and 1.0e-8 for the function value. At
% the end, the trajectory of each algorithm is shown on a contour plot
% of the function.
%
% DEMOPT1(XINIT) allows the user to specify a row vector with two
% columns as the starting point. The default is the point [-1 1]. Note
% that the contour plot has an x range of [-1.5, 1.5] and a y range of
% [-0.5, 2.1], so it is best to choose a starting point in the same
% region.
%
% See also
% CONJGRAD, GRADDESC, QUASINEW, SCG, ROSEN, ROSEGRAD
%

% Copyright (c) Ian T Nabney (1996-2001)

% Initialise start point for search
if nargin < 1
  xinit = [-1 1];	% Traditional start point
elseif ~isequal(size(xinit), [1 2])
  xinit = [-1 1];	% Fall back to the default if the shape is wrong
end

% Find out if flops is available (i.e. pre-version 6 Matlab)
v = version;
if (str2num(strtok(v, '.')) >= 6)
  flops_works = logical(0);
else
  flops_works = logical(1);
end

% Set up options
options = foptions;	% Standard options
options(1) = -1;	% Turn off printing completely
options(3) = 1e-8;	% Tolerance in value of function
options(14) = 100;	% Max. 100 iterations of algorithm

clc
disp('This demonstration compares the performance of four generic')
disp('optimisation routines when finding the minimum of Rosenbrock''s')
disp('function y = 100*(x2-x1^2)^2 + (1-x1)^2.')
disp(' ')
disp('The global minimum of this function is at [1 1].')
disp(['Each algorithm starts at the point [' num2str(xinit(1))...
    ' ' num2str(xinit(2)) '].'])
disp(' ')
disp('Press any key to continue.')
pause
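% For reference, Rosenbrock's function and its gradient at a single point
% x = [x1 x2] are written out below.  The toolbox routines ROSEN and
% ROSEGRAD compute these (ROSEN takes a matrix with one point per row, as
% in the contour plot that follows); the expressions here are purely
% illustrative.
%   f = 100*(x2 - x1^2)^2 + (1 - x1)^2;
%   g = [-400*x1*(x2 - x1^2) - 2*(1 - x1), 200*(x2 - x1^2)];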
% Generate a contour plot of the function
a = -1.5:.02:1.5;
b = -0.5:.02:2.1;
[A, B] = meshgrid(a, b);
Z = rosen([A(:), B(:)]);
Z = reshape(Z, length(b), length(a));
l = -1:6;
v = 2.^l;
fh1 = figure;
contour(a, b, Z, v)
title('Contour plot of Rosenbrock''s function')
hold on

clc
disp('We now use quasi-Newton, conjugate gradient, scaled conjugate')
disp('gradient, and gradient descent with line search algorithms')
disp('to find a local minimum of this function. Each algorithm is stopped')
disp('when 100 cycles have elapsed, or if the change in function value')
disp('is less than 1.0e-8 or the change in the input vector is less than')
disp('1.0e-4 in magnitude.')
disp(' ')
disp('Press any key to continue.')
pause

clc
x = xinit;
if flops_works, flops(0), end	% Reset the flop counter where it exists
[x, options, errlog, pointlog] = quasinew('rosen', x, options, 'rosegrad');
fprintf(1, 'For quasi-Newton method:\n')
fprintf(1, 'Final point is (%f, %f), value is %f\n', x(1), x(2), options(8))
fprintf(1, 'Number of function evaluations is %d\n', options(10))
fprintf(1, 'Number of gradient evaluations is %d\n', options(11))
if flops_works
  opt_flops = flops;
  fprintf(1, 'Number of floating point operations is %d\n', opt_flops)
end
fprintf(1, 'Number of cycles is %d\n', size(pointlog, 1) - 1);
disp(' ')

x = xinit;
if flops_works, flops(0), end
[x, options, errlog2, pointlog2] = conjgrad('rosen', x, options, 'rosegrad');
fprintf(1, 'For conjugate gradient method:\n')
fprintf(1, 'Final point is (%f, %f), value is %f\n', x(1), x(2), options(8))
fprintf(1, 'Number of function evaluations is %d\n', options(10))
fprintf(1, 'Number of gradient evaluations is %d\n', options(11))
if flops_works
  opt_flops = flops;
  fprintf(1, 'Number of floating point operations is %d\n', opt_flops)
end
fprintf(1, 'Number of cycles is %d\n', size(pointlog2, 1) - 1);
disp(' ')

x = xinit;
if flops_works, flops(0), end
[x, options, errlog3, pointlog3] = scg('rosen', x, options, 'rosegrad');
fprintf(1, 'For scaled conjugate gradient method:\n')
fprintf(1, 'Final point is (%f, %f), value is %f\n', x(1), x(2), options(8))
fprintf(1, 'Number of function evaluations is %d\n', options(10))
fprintf(1, 'Number of gradient evaluations is %d\n', options(11))
if flops_works
  opt_flops = flops;
  fprintf(1, 'Number of floating point operations is %d\n', opt_flops)
end
fprintf(1, 'Number of cycles is %d\n', size(pointlog3, 1) - 1);
disp(' ')

x = xinit;
options(7) = 1;	% Line minimisation used
if flops_works, flops(0), end
[x, options, errlog4, pointlog4] = graddesc('rosen', x, options, 'rosegrad');
fprintf(1, 'For gradient descent method:\n')
fprintf(1, 'Final point is (%f, %f), value is %f\n', x(1), x(2), options(8))
fprintf(1, 'Number of function evaluations is %d\n', options(10))
fprintf(1, 'Number of gradient evaluations is %d\n', options(11))
if flops_works
  opt_flops = flops;
  fprintf(1, 'Number of floating point operations is %d\n', opt_flops)
end
fprintf(1, 'Number of cycles is %d\n', size(pointlog4, 1) - 1);
disp(' ')
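% The comparison below is based on the evaluation counts printed above.
% Note that options is overwritten by every run, so to keep the counts
% they would have to be copied straight after each call, for example
% (illustrative only; these variables are not used elsewhere):
%   qn_evals = options([10 11]);	% function and gradient evaluations
% The cycle counts can always be recovered from the point logs:
%   cycles = [size(pointlog, 1), size(pointlog2, 1), ...
%             size(pointlog3, 1), size(pointlog4, 1)] - 1;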
disp('Note that gradient descent does not reach a local minimum in')
disp('100 cycles.')
disp(' ')
disp('On this problem, where the function is cheap to evaluate, the')
disp('computational effort is dominated by the algorithm overhead.')
disp('However on more complex optimisation problems (such as those')
disp('involving neural networks), computational effort is dominated by')
disp('the number of function and gradient evaluations. Counting these,')
disp('we can rank the algorithms: quasi-Newton (the best), conjugate')
disp('gradient, scaled conjugate gradient, gradient descent (the worst).')
disp(' ')
disp('Press any key to continue.')
pause
clc
disp('We now plot the trajectory of search points for each algorithm')
disp('superimposed on the contour plot.')
disp(' ')
disp('Press any key to continue.')
pause
plot(pointlog4(:,1), pointlog4(:,2), 'bd', 'MarkerSize', 6)
plot(pointlog3(:,1), pointlog3(:,2), 'mx', 'MarkerSize', 6, 'LineWidth', 2)
plot(pointlog(:,1), pointlog(:,2), 'k.', 'MarkerSize', 18)
plot(pointlog2(:,1), pointlog2(:,2), 'g+', 'MarkerSize', 6, 'LineWidth', 2)
lh = legend('Gradient Descent', 'Scaled Conjugate Gradients', ...
    'Quasi-Newton', 'Conjugate Gradients');

hold off

clc
disp('Press any key to end.')
pause
close(fh1);
clear all;