%DEMMLP1 Demonstrate simple regression using a multi-layer perceptron
%
%	Description
%	The problem consists of one input variable X and one target variable
%	T with data generated by sampling X at equal intervals and then
%	generating target data by computing SIN(2*PI*X) and adding Gaussian
%	noise. A 2-layer network with linear outputs is trained by minimizing
%	a sum-of-squares error function using the scaled conjugate gradient
%	optimizer.
%
%	See also
%	MLP, MLPERR, MLPGRAD, SCG
%

%	Copyright (c) Ian T Nabney (1996-2001)


% Generate the matrix of inputs x and targets t.

ndata = 20;			% Number of data points.
noise = 0.2;			% Standard deviation of noise distribution.
x = [0:1/(ndata - 1):1]';
randn('state', 1);
t = sin(2*pi*x) + noise*randn(ndata, 1);

clc
disp('This demonstration illustrates the use of a Multi-Layer Perceptron')
disp('network for regression problems. The data is generated from a noisy')
disp('sine function.')
disp(' ')
disp('Press any key to continue.')
pause

% Set up network parameters.
nin = 1;			% Number of inputs.
nhidden = 3;			% Number of hidden units.
nout = 1;			% Number of outputs.
alpha = 0.01;			% Coefficient of weight-decay prior.

% Create and initialize network weight vector.

net = mlp(nin, nhidden, nout, 'linear', alpha);

% Set up vector of options for the optimiser.

options = zeros(1,18);
options(1) = 1;			% This provides display of error values.
options(14) = 100;		% Number of training cycles.

clc
disp(['The network has ', num2str(nhidden), ' hidden units and a weight decay'])
disp(['coefficient of ', num2str(alpha), '.'])
disp(' ')
disp('After initializing the network, we train it using the scaled conjugate')
disp('gradients algorithm for 100 cycles.')
disp(' ')
disp('Press any key to continue.')
pause

% Train using scaled conjugate gradients.
[net, options] = netopt(net, options, x, t, 'scg');

disp(' ')
disp('Now we plot the data, underlying function, and network outputs')
disp('on a single graph to compare the results.')
disp(' ')
disp('Press any key to continue.')
pause

% Plot the data, the original function, and the trained network function.
plotvals = [0:0.01:1]';
y = mlpfwd(net, plotvals);
fh1 = figure;
plot(x, t, 'ob')
hold on
xlabel('Input')
ylabel('Target')
axis([0 1 -1.5 1.5])
[fx, fy] = fplot('sin(2*pi*x)', [0 1]);
plot(fx, fy, '-r', 'LineWidth', 2)
plot(plotvals, y, '-k', 'LineWidth', 2)
legend('data', 'function', 'network');

disp(' ')
disp('Press any key to end.')
pause
close(fh1);
clear all;
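
% A minimal follow-on sketch (not part of the original demo): if executed
% before the final CLEAR ALL above, the trained network can also be scored
% numerically rather than only plotted. MLPERR and MLPFWD are the Netlab
% routines listed under "See also"; the variable names xtest, ytest, and
% etrain are illustrative assumptions, not from the demo.
%
%   etrain = mlperr(net, x, t);      % regularized sum-of-squares error on the training set
%   xtest = [0:1/9:1]';              % ten evenly spaced test inputs in [0, 1]
%   ytest = mlpfwd(net, xtest);      % forward-propagate test inputs through the trained net
%   disp(['Final training error: ', num2str(etrain)])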