% Netlab Toolbox
% Version 3.3.1 18-Jun-2004
%
% conffig       - Display a confusion matrix.
% confmat       - Compute a confusion matrix.
% conjgrad      - Conjugate gradients optimization.
% consist       - Check that arguments are consistent.
% convertoldnet - Convert pre-2.3 release MLP and MDN nets to new format.
% datread       - Read data from an ASCII file.
% datwrite      - Write data to an ASCII file.
% dem2ddat      - Generates two-dimensional data for demos.
% demard        - Automatic relevance determination using the MLP.
% demev1        - Demonstrate Bayesian regression for the MLP.
% demev2        - Demonstrate Bayesian classification for the MLP.
% demev3        - Demonstrate Bayesian regression for the RBF.
% demgauss      - Demonstrate sampling from Gaussian distributions.
% demglm1       - Demonstrate simple classification using a generalized linear model.
% demglm2       - Demonstrate simple classification using a generalized linear model.
% demgmm1       - Demonstrate density modelling with a Gaussian mixture model.
% demgmm3       - Demonstrate density modelling with a Gaussian mixture model.
% demgmm4       - Demonstrate density modelling with a Gaussian mixture model.
% demgmm5       - Demonstrate density modelling with a PPCA mixture model.
% demgp         - Demonstrate simple regression using a Gaussian Process.
% demgpard      - Demonstrate ARD using a Gaussian Process.
% demgpot       - Computes the gradient of the negative log likelihood for a mixture model.
% demgtm1       - Demonstrate EM for GTM.
% demgtm2       - Demonstrate GTM for visualisation.
% demhint       - Demonstration of Hinton diagram for 2-layer feed-forward network.
% demhmc1       - Demonstrate Hybrid Monte Carlo sampling on mixture of two Gaussians.
% demhmc2       - Demonstrate Bayesian regression with Hybrid Monte Carlo sampling.
% demhmc3       - Demonstrate Bayesian regression with Hybrid Monte Carlo sampling.
% demkmean      - Demonstrate simple clustering model trained with K-means.
% demknn1       - Demonstrate nearest neighbour classifier.
% demmdn1       - Demonstrate fitting a multi-valued function using a Mixture Density Network.
% demmet1       - Demonstrate Markov Chain Monte Carlo sampling on a Gaussian.
% demmlp1       - Demonstrate simple regression using a multi-layer perceptron.
% demmlp2       - Demonstrate simple classification using a multi-layer perceptron.
% demnlab       - A front-end Graphical User Interface to the demos.
% demns1        - Demonstrate Neuroscale for visualisation.
% demolgd1      - Demonstrate simple MLP optimisation with on-line gradient descent.
% demopt1       - Demonstrate different optimisers on Rosenbrock's function.
% dempot        - Computes the negative log likelihood for a mixture model.
% demprgp       - Demonstrate sampling from a Gaussian Process prior.
% demprior      - Demonstrate sampling from a multi-parameter Gaussian prior.
% demrbf1       - Demonstrate simple regression using a radial basis function network.
% demsom1       - Demonstrate SOM for visualisation.
% demtrain      - Demonstrate training of MLP network.
% dist2         - Calculates squared distance between two sets of points.
% eigdec        - Sorted eigendecomposition.
% errbayes      - Evaluate Bayesian error function for network.
% evidence      - Re-estimate hyperparameters using evidence approximation.
% fevbayes      - Evaluate Bayesian regularisation for network forward propagation.
% gauss         - Evaluate a Gaussian distribution.
% gbayes        - Evaluate gradient of Bayesian error function for network.
% glm           - Create a generalized linear model.
% glmderiv      - Evaluate derivatives of GLM outputs with respect to weights.
% glmerr        - Evaluate error function for generalized linear model.
% glmevfwd      - Forward propagation with evidence for GLM.
% glmfwd        - Forward propagation through generalized linear model.
% glmgrad       - Evaluate gradient of error function for generalized linear model.
% glmhess       - Evaluate the Hessian matrix for a generalised linear model.
% glminit       - Initialise the weights in a generalized linear model.
% glmpak        - Combines weights and biases into one weights vector.
% glmtrain      - Specialised training of generalized linear model.
% glmunpak      - Separates weights vector into weight and bias matrices.
% gmm           - Creates a Gaussian mixture model with specified architecture.
% gmmactiv      - Computes the activations of a Gaussian mixture model.
% gmmem         - EM algorithm for Gaussian mixture model.
% gmminit       - Initialises Gaussian mixture model from data.
% gmmpak        - Combines all the parameters in a Gaussian mixture model into one vector.
% gmmpost       - Computes the class posterior probabilities of a Gaussian mixture model.
% gmmprob       - Computes the data probability for a Gaussian mixture model.
% gmmsamp       - Sample from a Gaussian mixture distribution.
% gmmunpak      - Separates a vector of Gaussian mixture model parameters into its components.
% gp            - Create a Gaussian Process.
% gpcovar       - Calculate the covariance for a Gaussian Process.
% gpcovarf      - Calculate the covariance function for a Gaussian Process.
% gpcovarp      - Calculate the prior covariance for a Gaussian Process.
% gperr         - Evaluate error function for Gaussian Process.
% gpfwd         - Forward propagation through Gaussian Process.
% gpgrad        - Evaluate error gradient for Gaussian Process.
% gpinit        - Initialise Gaussian Process model.
% gppak         - Combines GP hyperparameters into one vector.
% gpunpak       - Separates hyperparameter vector into components.
% gradchek      - Checks a user-defined gradient function using finite differences.
% graddesc      - Gradient descent optimization.
% gsamp         - Sample from a Gaussian distribution.
% gtm           - Create a Generative Topographic Map.
% gtmem         - EM algorithm for Generative Topographic Mapping.
% gtmfwd        - Forward propagation through GTM.
% gtminit       - Initialise the weights and latent sample in a GTM.
% gtmlmean      - Mean responsibility for data in a GTM.
% gtmlmode      - Mode responsibility for data in a GTM.
% gtmmag        - Magnification factors for a GTM.
% gtmpost       - Latent space responsibility for data in a GTM.
% gtmprob       - Probability for data under a GTM.
% hbayes        - Evaluate Hessian of Bayesian error function for network.
% hesschek      - Use central differences to confirm correct evaluation of Hessian matrix.
% hintmat       - Evaluates the coordinates of the patches for a Hinton diagram.
% hinton        - Plot Hinton diagram for a weight matrix.
% histp         - Histogram estimate of 1-dimensional probability distribution.
% hmc           - Hybrid Monte Carlo sampling.
% kmeans        - Trains a k-means cluster model.
% knn           - Creates a K-nearest-neighbour classifier.
% knnfwd        - Forward propagation through a K-nearest-neighbour classifier.
% linef         - Calculate function value along a line.
% linemin       - One dimensional minimization.
% maxitmess     - Create a standard error message when training reaches max. iterations.
% mdn           - Creates a Mixture Density Network with specified architecture.
% mdn2gmm       - Converts an MDN mixture data structure to array of GMMs.
% mdndist2      - Calculates squared distance between centres of Gaussian kernels and data.
% mdnerr        - Evaluate error function for Mixture Density Network.
% mdnfwd        - Forward propagation through Mixture Density Network.
% mdngrad       - Evaluate gradient of error function for Mixture Density Network.
% mdninit       - Initialise the weights in a Mixture Density Network.
% mdnpak        - Combines weights and biases into one weights vector.
% mdnpost       - Computes the posterior probability for each MDN mixture component.
% mdnprob       - Computes the data likelihood for an MDN mixture structure.
% mdnunpak      - Separates weights vector into weight and bias matrices.
% metrop        - Markov Chain Monte Carlo sampling with Metropolis algorithm.
% minbrack      - Bracket a minimum of a function of one variable.
% mlp           - Create a 2-layer feedforward network.
% mlpbkp        - Backpropagate gradient of error function for 2-layer network.
% mlpderiv      - Evaluate derivatives of network outputs with respect to weights.
% mlperr        - Evaluate error function for 2-layer network.
% mlpevfwd      - Forward propagation with evidence for MLP.
% mlpfwd        - Forward propagation through 2-layer network.
% mlpgrad       - Evaluate gradient of error function for 2-layer network.
% mlphdotv      - Evaluate the product of the data Hessian with a vector.
% mlphess       - Evaluate the Hessian matrix for a multi-layer perceptron network.
% mlphint       - Plot Hinton diagram for 2-layer feed-forward network.
% mlpinit       - Initialise the weights in a 2-layer feedforward network.
% mlppak        - Combines weights and biases into one weights vector.
% mlpprior      - Create Gaussian prior for MLP.
% mlptrain      - Utility to train an MLP network for demtrain.
% mlpunpak      - Separates weights vector into weight and bias matrices.
% netderiv      - Evaluate derivatives of network outputs by weights generically.
% neterr        - Evaluate network error function for generic optimizers.
% netevfwd      - Generic forward propagation with evidence for network.
% netgrad       - Evaluate network error gradient for generic optimizers.
% nethess       - Evaluate network Hessian.
% netinit       - Initialise the weights in a network.
% netopt        - Optimize the weights in a network model.
% netpak        - Combines weights and biases into one weights vector.
% netunpak      - Separates weights vector into weight and bias matrices.
% olgd          - On-line gradient descent optimization.
% pca           - Principal Components Analysis.
% plotmat       - Display a matrix.
% ppca          - Probabilistic Principal Components Analysis.
% quasinew      - Quasi-Newton optimization.
% rbf           - Creates an RBF network with specified architecture.
% rbfbkp        - Backpropagate gradient of error function for RBF network.
% rbfderiv      - Evaluate derivatives of RBF network outputs with respect to weights.
% rbferr        - Evaluate error function for RBF network.
% rbfevfwd      - Forward propagation with evidence for RBF.
% rbffwd        - Forward propagation through RBF network with linear outputs.
% rbfgrad       - Evaluate gradient of error function for RBF network.
% rbfhess       - Evaluate the Hessian matrix for RBF network.
% rbfjacob      - Evaluate derivatives of RBF network outputs with respect to inputs.
% rbfpak        - Combines all the parameters in an RBF network into one weights vector.
% rbfprior      - Create Gaussian prior and output layer mask for RBF.
% rbfsetbf      - Set basis functions of RBF from data.
% rbfsetfw      - Set basis function widths of RBF.
% rbftrain      - Two stage training of RBF network.
% rbfunpak      - Separates a vector of RBF weights into its components.
% rosegrad      - Calculate gradient of Rosenbrock's function.
% rosen         - Calculate Rosenbrock's function.
% scg           - Scaled conjugate gradient optimization.
% som           - Creates a Self-Organising Map.
% somfwd        - Forward propagation through a Self-Organising Map.
% sompak        - Combines node weights into one weights matrix.
% somtrain      - Kohonen training algorithm for SOM.
% somunpak      - Replaces node weights in SOM.
%
% Copyright (c) Ian T Nabney (1996-2001)
%