%DEMHMC3 Demonstrate Bayesian regression with Hybrid Monte Carlo sampling.
%
% Description
% The problem consists of one input variable X and one target variable
% T with data generated by sampling X from a Gaussian distribution and
% then generating target data by computing SIN(2*PI*X) and adding
% Gaussian noise. The model is a 2-layer network with linear outputs,
% and the hybrid Monte Carlo algorithm (with persistence) is used to
% sample from the posterior distribution of the weights. The graph
% shows the underlying function, 100 samples from the function given
% by the posterior distribution of the weights, and the average
% prediction (weighted by the posterior probabilities).
%
% See also
% DEMHMC2, HMC, MLP, MLPERR, MLPGRAD
%

% Copyright (c) Ian T Nabney (1996-2001)


% Generate the matrix of inputs x and targets t.
ndata = 20; % Number of data points.
noise = 0.1; % Standard deviation of noise distribution.
nin = 1; % Number of inputs.
nout = 1; % Number of outputs.

seed = 42; % Seed for random number generators.
randn('state', seed);
rand('state', seed);

x = 0.25 + 0.1*randn(ndata, nin);
t = sin(2*pi*x) + noise*randn(size(x));

clc
disp('This demonstration illustrates the use of the hybrid Monte Carlo')
disp('algorithm to sample from the posterior weight distribution of a')
disp('multi-layer perceptron.')
disp(' ')
disp('A regression problem is used, with the one-dimensional data drawn')
disp('from a noisy sine function. The x values are sampled from a normal')
disp('distribution with mean 0.25 and variance 0.01.')
disp(' ')
disp('First we initialise the network.')
disp(' ')
disp('Press any key to continue.')
pause

% Set up network parameters.
nhidden = 5; % Number of hidden units.
alpha = 0.001; % Coefficient of weight-decay prior.
beta = 100.0; % Coefficient of data error.

% Create and initialise network model.
% Initialise weights reasonably close to 0.
net = mlp(nin, nhidden, nout, 'linear', alpha, beta);
net = mlpinit(net, 10);

clc
disp('Next we take 100 samples from the posterior distribution. The first')
disp('300 samples at the start of the chain are omitted. As persistence')
disp('is used, the momentum has a small random component added at each step.')
disp('Each trajectory uses 10 leapfrog steps (compared with 100 in demhmc2),')
disp('and the step size is 0.005 (compared with 0.002). The new state is')
disp('accepted if the threshold value is greater than a random number')
disp('between 0 and 1.')
disp(' ')
disp('Negative step numbers indicate samples discarded from the start of')
disp('the chain.')
disp(' ')
disp('Press any key to continue.')
pause

% Set up vector of options for hybrid Monte Carlo.
nsamples = 100; % Number of retained samples.

options = foptions; % Default options vector.
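% For reference, a sketch of one persistent-HMC step in the usual
% formulation (the authoritative implementation is in hmc.m; the
% variable names below are illustrative only):
%
%   p <- alpha*p + sqrt(1 - alpha^2)*n, n ~ N(0, I)    (partial momentum refresh)
%   (w', p') <- leapfrog(w, p, eps, L)                 (simulate Hamiltonian dynamics)
%   accept if exp(E(w) + K(p) - E(w') - K(p')) > rand  (Metropolis test)
%
% where w holds the packed network weights, E is the error function
% (potential energy), K(p) = p*p'/2 is the kinetic energy, eps is the
% step size (options(18)), L is the trajectory length (options(7)) and
% alpha is the persistence coefficient (options(17)). The exponential
% is the 'threshold value' mentioned in the text above.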
options(1) = 1; % Switch on diagnostics.
options(5) = 1; % Use persistence.
options(7) = 10; % Number of steps in trajectory.
options(14) = nsamples; % Number of Monte Carlo samples returned.
options(15) = 300; % Number of samples omitted at start of chain.
options(17) = 0.95; % Alpha value in persistence.
options(18) = 0.005; % Step size.

w = mlppak(net);
% Initialise HMC.
hmc('state', 42);
[samples, energies] = hmc('neterr', w, options, 'netgrad', net, x, t);

clc
disp('The plot shows the underlying noise-free function, the 100 samples')
disp('produced from the MLP, and their average as a Monte Carlo estimate')
disp('of the true posterior average.')
disp(' ')
disp('Press any key to continue.')
pause

nplot = 300;
plotvals = [0 : 1/(nplot - 1) : 1]';
pred = zeros(size(plotvals));
fh1 = figure;
hold on
for k = 1:nsamples
  w2 = samples(k,:);
  net2 = mlpunpak(net, w2);
  y = mlpfwd(net2, plotvals);
  % Accumulate the sum of the sampled predictions.
  pred = pred + y;
  h4 = plot(plotvals, y, '-r', 'LineWidth', 1);
end
% Average the predictions over the retained samples.
pred = pred./nsamples;

% Plot data.
h1 = plot(x, t, 'ob', 'LineWidth', 2, 'MarkerFaceColor', 'blue');
axis([0 1 -3 3])

% Plot the underlying function.
[fx, fy] = fplot('sin(2*pi*x)', [0 1]);
h2 = plot(fx, fy, '--g', 'LineWidth', 2);
set(gca, 'box', 'on');

% Plot averaged prediction.
h3 = plot(plotvals, pred, '-c', 'LineWidth', 2);

lstrings = char('Data', 'Function', 'Prediction', 'Samples');
legend([h1 h2 h3 h4], lstrings, 3);
hold off

disp('Note how the predictions move much further from the true function')
disp('away from the region of high data density.')
disp(' ')
disp('Press any key to exit.')
pause
close(fh1);
clear all;
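% A closing note: the cyan 'Prediction' curve is the Monte Carlo
% estimate of the posterior predictive mean,
%
%   ybar(x) = (1/nsamples) * sum_k mlpfwd(net_k, x),
%
% where net_k carries the k-th weight sample. The weighting by the
% posterior is implicit: hmc visits weight vectors in proportion to
% their posterior probability, so a plain average suffices.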