%DEMGPARD Demonstrate ARD using a Gaussian Process.
%
%	Description
%	The data consists of three input variables X1, X2 and X3, and one
%	target variable T. The target data is generated by computing
%	SIN(2*PI*X1) and adding Gaussian noise; X2 is a copy of X1 with a
%	higher level of added noise, and X3 is sampled randomly from a
%	Gaussian distribution. A Gaussian Process is trained by optimising
%	the hyperparameters using the scaled conjugate gradient algorithm.
%	The final values of the hyperparameters show that the model
%	successfully identifies the importance of each input.
%
%	See also
%	DEMGP, GP, GPERR, GPFWD, GPGRAD, GPINIT, SCG
%

%	Copyright (c) Ian T Nabney (1996-2001)

clc;
randn('state', 1729);
rand('state', 1729);
disp('This demonstration illustrates the technique of automatic relevance')
disp('determination (ARD) using a Gaussian Process.')
disp(' ');
disp('First, we set up a synthetic data set involving three input variables:')
disp('x1 is sampled uniformly from the range (0,1) and has a low level of')
disp('added Gaussian noise, x2 is a copy of x1 with a higher level of added')
disp('noise, and x3 is sampled randomly from a Gaussian distribution. The')
disp('single target variable is given by t = sin(2*pi*x1) with additive')
disp('Gaussian noise. Thus x1 is very relevant for determining the target')
disp('value, x2 is of some relevance, while x3 should in principle be')
disp('irrelevant.')
disp(' ');
disp('Press any key to see a plot of t against x1.')
pause;

ndata = 100;
x1 = rand(ndata, 1);
x2 = x1 + 0.05*randn(ndata, 1);
x3 = 0.5 + 0.5*randn(ndata, 1);
x = [x1, x2, x3];
t = sin(2*pi*x1) + 0.1*randn(ndata, 1);

% Plot the data and the original function.
h = figure;
plot(x1, t, 'ob')
hold on
xlabel('Input x1')
ylabel('Target')
axis([0 1 -1.5 1.5])
[fx, fy] = fplot('sin(2*pi*x)', [0 1]);
plot(fx, fy, '-g', 'LineWidth', 2);
legend('data', 'function');

disp(' ');
disp('Press any key to continue')
pause; clc;

disp('The Gaussian Process has a separate hyperparameter for each input.')
disp('The hyperparameters are trained by error minimisation using the scaled')
disp('conjugate gradient optimiser.')
disp(' ');
disp('Press any key to create and train the model.')
disp(' ');
pause;

net = gp(3, 'sqexp');
% Initialise the hyperparameters with a zero-mean Gaussian prior.
prior.pr_mean = 0;
prior.pr_var = 0.1;
net = gpinit(net, x, t, prior);

% Now train to find the hyperparameters.
options = foptions;
options(1) = 1;		% Display error values during training.
options(14) = 30;	% Maximum number of training cycles.

[net, options] = netopt(net, options, x, t, 'scg');

rel = exp(net.inweights);

fprintf(1, ...
  '\nFinal hyperparameters:\n\n bias:\t\t%10.6f\n noise:\t%10.6f\n', ...
  exp(net.bias), exp(net.noise));
fprintf(1, ' Vertical scale: %8.6f\n', exp(net.fpar(1)));
fprintf(1, ' Input 1:\t%10.6f\n Input 2:\t%10.6f\n', ...
  rel(1), rel(2));
fprintf(1, ' Input 3:\t%10.6f\n\n', rel(3));
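% The following aside is not part of the original demo: it converts the
% relevance values into length scales, which are often easier to
% interpret.  It assumes the usual ARD squared exponential
% parameterisation, in which the covariance between inputs x_i and x_j is
%
%   C(x_i, x_j) = v * exp(-0.5 * sum_k w_k * (x_ik - x_jk)^2) + bias
%
% with the noise variance added on the diagonal (see GPCOVAR for the
% form Netlab actually uses).  Each w_k = exp(net.inweights(k)) then
% acts as an inverse squared length scale, so input k has characteristic
% length scale 1/sqrt(w_k): the shorter the length scale, the faster the
% output varies along that input, and the more relevant the input.
lenscale = 1./sqrt(rel);
fprintf(1, ' Length scales:\t%10.6f %10.6f %10.6f\n\n', ...
  lenscale(1), lenscale(2), lenscale(3));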
disp(' ');
disp('We see that the inverse lengthscale associated with input x1 is')
disp('large, that of x2 has an intermediate value and that of x3 is')
disp('small.')
disp(' ');
disp('This implies that the Gaussian Process is giving greatest emphasis')
disp('to x1 and least emphasis to x3, with intermediate emphasis on')
disp('x2 in the covariance function.')
disp(' ')
disp('Since the target t is statistically independent of x3 we might')
disp('expect the hyperparameter associated with this input to go to')
disp('zero. However, for any finite data set there may be some chance')
disp('correlation between x3 and t, and so the corresponding')
disp('hyperparameter remains finite.')
disp(' ');
disp('Press any key to continue.')
pause

disp('Finally, we plot the output of the Gaussian Process along the line')
disp('x1 = x2 = x3, together with the true underlying function.')
xt = linspace(0, 1, 50);
xtest = [xt', xt', xt'];

% Compute the covariance of the training inputs and invert it once, so
% that GPFWD does not have to recompute it for each prediction.
cn = gpcovar(net, x);
cninv = inv(cn);
[ytest, sigsq] = gpfwd(net, xtest, cninv);
sig = sqrt(sigsq);

% Plot the predictive mean with 2 standard deviation error bars.
figure(h); hold on;
plot(xt, ytest, '-k');
plot(xt, ytest+(2*sig), '-b', xt, ytest-(2*sig), '-b');
axis([0 1 -1.5 1.5]);
fplot('sin(2*pi*x)', [0 1], '--m');

disp(' ');
disp('Press any key to end.')
pause; clc; close(h); clear all
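% ----------------------------------------------------------------------
% Added note (not part of the original demo).  GPFWD above implements
% the standard Gaussian Process predictive equations: with training
% covariance matrix C (from GPCOVAR), training targets t, and the vector
% k of covariances between a test point x* and the training inputs, the
% predictive distribution at x* is Gaussian with, in the usual
% formulation,
%
%   mean(x*) = k' * inv(C) * t
%   var(x*)  = kappa - k' * inv(C) * k
%
% where kappa is the prior variance at x*.  This is why the demo
% precomputes cninv = inv(C) once and passes it to GPFWD, and why the
% curves ytest +/- 2*sig plotted above give approximate 95 per cent
% error bars.
% ----------------------------------------------------------------------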