
NETLAB Online Reference Documentation

Welcome to the NETLAB online reference documentation. The NETLAB simulation software is designed to provide all the tools necessary for principled and theoretically well-founded application development. The NETLAB library is based on the approach and techniques described in Neural Networks for Pattern Recognition (Bishop, 1995). The library includes software implementations of a wide range of data analysis techniques, many of which are not widely available and are rarely, if ever, included in standard neural network simulation packages.

The online reference documentation provides direct hypertext links to specific Netlab function descriptions.

If you have any comments or problems to report, please contact Ian Nabney (i.t.nabney@aston.ac.uk) or Christopher Bishop (c.m.bishop@aston.ac.uk).

Index

An alphabetical list of the functions in Netlab.

conffig
    Display a confusion matrix.
confmat
    Compute a confusion matrix.
conjgrad
    Conjugate gradients optimization.
consist
    Check that arguments are consistent.
convertoldnet
    Convert pre-2.3 release MLP and MDN nets to the new format.
datread
    Read data from an ASCII file.
datwrite
    Write data to an ASCII file.
dem2ddat
    Generate two-dimensional data for demos.
demard
    Automatic relevance determination using the MLP.
demev1
    Demonstrate Bayesian regression for the MLP.
demev2
    Demonstrate Bayesian classification for the MLP.
demev3
    Demonstrate Bayesian regression for the RBF.
demgauss
    Demonstrate sampling from Gaussian distributions.
demglm1
    Demonstrate simple classification using a generalized linear model.
demglm2
    Demonstrate simple classification using a generalized linear model.
demgmm1
    Demonstrate density modelling with a Gaussian mixture model.
demgmm3
    Demonstrate density modelling with a Gaussian mixture model.
demgmm4
    Demonstrate density modelling with a Gaussian mixture model.
demgmm5
    Demonstrate density modelling with a PPCA mixture model.
demgp
    Demonstrate simple regression using a Gaussian Process.
demgpard
    Demonstrate ARD using a Gaussian Process.
demgpot
    Compute the gradient of the negative log likelihood for a mixture model.
demgtm1
    Demonstrate EM for GTM.
demgtm2
    Demonstrate GTM for visualisation.
demhint
    Demonstration of a Hinton diagram for a 2-layer feed-forward network.
demhmc1
    Demonstrate Hybrid Monte Carlo sampling on a mixture of two Gaussians.
demhmc2
    Demonstrate Bayesian regression with Hybrid Monte Carlo sampling.
demhmc3
    Demonstrate Bayesian regression with Hybrid Monte Carlo sampling.
demkmean
    Demonstrate a simple clustering model trained with K-means.
demknn1
    Demonstrate a nearest neighbour classifier.
demmdn1
    Demonstrate fitting a multi-valued function using a Mixture Density Network.
demmet1
    Demonstrate Markov Chain Monte Carlo sampling on a Gaussian.
demmlp1
    Demonstrate simple regression using a multi-layer perceptron.
demmlp2
    Demonstrate simple classification using a multi-layer perceptron.
demnlab
    A front-end Graphical User Interface to the demos.
demns1
    Demonstrate Neuroscale for visualisation.
demolgd1
    Demonstrate simple MLP optimisation with on-line gradient descent.
demopt1
    Demonstrate different optimisers on Rosenbrock's function.
dempot
    Compute the negative log likelihood for a mixture model.
demprgp
    Demonstrate sampling from a Gaussian Process prior.
demprior
    Demonstrate sampling from a multi-parameter Gaussian prior.
demrbf1
    Demonstrate simple regression using a radial basis function network.
demsom1
    Demonstrate SOM for visualisation.
demtrain
    Demonstrate training of an MLP network.
dist2
    Calculate the squared distance between two sets of points.
eigdec
    Sorted eigendecomposition.
errbayes
    Evaluate the Bayesian error function for a network.
evidence
    Re-estimate hyperparameters using the evidence approximation.
fevbayes
    Evaluate Bayesian regularisation for network forward propagation.
gauss
    Evaluate a Gaussian distribution.
gbayes
    Evaluate the gradient of the Bayesian error function for a network.
glm
    Create a generalized linear model.
glmderiv
    Evaluate derivatives of GLM outputs with respect to weights.
glmerr
    Evaluate the error function for a generalized linear model.
glmevfwd
    Forward propagation with evidence for a GLM.
glmfwd
    Forward propagation through a generalized linear model.
glmgrad
    Evaluate the gradient of the error function for a generalized linear model.
glmhess
    Evaluate the Hessian matrix for a generalized linear model.
glminit
    Initialise the weights in a generalized linear model.
glmpak
    Combine weights and biases into one weights vector.
glmtrain
    Specialised training of a generalized linear model.
glmunpak
    Separate a weights vector into weight and bias matrices.
gmm
    Create a Gaussian mixture model with a specified architecture.
gmmactiv
    Compute the activations of a Gaussian mixture model.
gmmem
    EM algorithm for a Gaussian mixture model.
gmminit
    Initialise a Gaussian mixture model from data.
gmmpak
    Combine all the parameters in a Gaussian mixture model into one vector.
gmmpost
    Compute the class posterior probabilities of a Gaussian mixture model.
gmmprob
    Compute the data probability for a Gaussian mixture model.
gmmsamp
    Sample from a Gaussian mixture distribution.
gmmunpak
    Separate a vector of Gaussian mixture model parameters into its components.
gp
    Create a Gaussian Process.
gpcovar
    Calculate the covariance for a Gaussian Process.
gpcovarf
    Calculate the covariance function for a Gaussian Process.
gpcovarp
    Calculate the prior covariance for a Gaussian Process.
gperr
    Evaluate the error function for a Gaussian Process.
gpfwd
    Forward propagation through a Gaussian Process.
gpgrad
    Evaluate the error gradient for a Gaussian Process.
gpinit
    Initialise a Gaussian Process model.
gppak
    Combine GP hyperparameters into one vector.
gpunpak
    Separate a hyperparameter vector into its components.
gradchek
    Check a user-defined gradient function using finite differences.
graddesc
    Gradient descent optimization.
gsamp
    Sample from a Gaussian distribution.
gtm
    Create a Generative Topographic Map.
gtmem
    EM algorithm for Generative Topographic Mapping.
gtmfwd
    Forward propagation through a GTM.
gtminit
    Initialise the weights and latent sample in a GTM.
gtmlmean
    Mean responsibility for data in a GTM.
gtmlmode
    Mode responsibility for data in a GTM.
gtmmag
    Magnification factors for a GTM.
gtmpost
    Latent space responsibility for data in a GTM.
gtmprob
    Probability for data under a GTM.
hbayes
    Evaluate the Hessian of the Bayesian error function for a network.
hesschek
    Use central differences to confirm correct evaluation of the Hessian matrix.
hintmat
    Evaluate the coordinates of the patches for a Hinton diagram.
hinton
    Plot a Hinton diagram for a weight matrix.
histp
    Histogram estimate of a 1-dimensional probability distribution.
hmc
    Hybrid Monte Carlo sampling.
kmeans
    Train a k-means cluster model.
knn
    Create a K-nearest-neighbour classifier.
knnfwd
    Forward propagation through a K-nearest-neighbour classifier.
linef
    Calculate a function value along a line.
linemin
    One-dimensional minimization.
maxitmess
    Create a standard error message when training reaches the maximum number of iterations.
mdn
    Create a Mixture Density Network with a specified architecture.
mdn2gmm
    Convert an MDN mixture data structure to an array of GMMs.
mdndist2
    Calculate the squared distance between centres of Gaussian kernels and data.
mdnerr
    Evaluate the error function for a Mixture Density Network.
mdnfwd
    Forward propagation through a Mixture Density Network.
mdngrad
    Evaluate the gradient of the error function for a Mixture Density Network.
mdninit
    Initialise the weights in a Mixture Density Network.
mdnpak
    Combine weights and biases into one weights vector.
mdnpost
    Compute the posterior probability for each MDN mixture component.
mdnprob
    Compute the data likelihood for an MDN mixture structure.
mdnunpak
    Separate a weights vector into weight and bias matrices.
metrop
    Markov Chain Monte Carlo sampling with the Metropolis algorithm.
minbrack
    Bracket a minimum of a function of one variable.
mlp
    Create a 2-layer feedforward network.
mlpbkp
    Backpropagate the gradient of the error function for a 2-layer network.
mlpderiv
    Evaluate derivatives of network outputs with respect to weights.
mlperr
    Evaluate the error function for a 2-layer network.
mlpevfwd
    Forward propagation with evidence for an MLP.
mlpfwd
    Forward propagation through a 2-layer network.
mlpgrad
    Evaluate the gradient of the error function for a 2-layer network.
mlphdotv
    Evaluate the product of the data Hessian with a vector.
mlphess
    Evaluate the Hessian matrix for a multi-layer perceptron network.
mlphint
    Plot a Hinton diagram for a 2-layer feed-forward network.
mlpinit
    Initialise the weights in a 2-layer feedforward network.
mlppak
    Combine weights and biases into one weights vector.
mlpprior
    Create a Gaussian prior for an MLP.
mlptrain
    Utility to train an MLP network for demtrain.
mlpunpak
    Separate a weights vector into weight and bias matrices.
netderiv
    Evaluate derivatives of network outputs by weights generically.
neterr
    Evaluate the network error function for generic optimizers.
netevfwd
    Generic forward propagation with evidence for a network.
netgrad
    Evaluate the network error gradient for generic optimizers.
nethess
    Evaluate the network Hessian.
netinit
    Initialise the weights in a network.
netopt
    Optimize the weights in a network model.
netpak
    Combine weights and biases into one weights vector.
netunpak
    Separate a weights vector into weight and bias matrices.
olgd
    On-line gradient descent optimization.
pca
    Principal Components Analysis.
plotmat
    Display a matrix.
ppca
    Probabilistic Principal Components Analysis.
quasinew
    Quasi-Newton optimization.
rbf
    Create an RBF network with a specified architecture.
rbfbkp
    Backpropagate the gradient of the error function for an RBF network.
rbfderiv
    Evaluate derivatives of RBF network outputs with respect to weights.
rbferr
    Evaluate the error function for an RBF network.
rbfevfwd
    Forward propagation with evidence for an RBF.
rbffwd
    Forward propagation through an RBF network with linear outputs.
rbfgrad
    Evaluate the gradient of the error function for an RBF network.
rbfhess
    Evaluate the Hessian matrix for an RBF network.
rbfjacob
    Evaluate derivatives of RBF network outputs with respect to inputs.
rbfpak
    Combine all the parameters in an RBF network into one weights vector.
rbfprior
    Create a Gaussian prior and output layer mask for an RBF.
rbfsetbf
    Set the basis functions of an RBF from data.
rbfsetfw
    Set the basis function widths of an RBF.
rbftrain
    Two-stage training of an RBF network.
rbfunpak
    Separate a vector of RBF weights into its components.
rosegrad
    Calculate the gradient of Rosenbrock's function.
rosen
    Calculate Rosenbrock's function.
scg
    Scaled conjugate gradient optimization.
som
    Create a Self-Organising Map.
somfwd
    Forward propagation through a Self-Organising Map.
sompak
    Combine node weights into one weights matrix.
somtrain
    Kohonen training algorithm for a SOM.
somunpak
    Replace node weights in a SOM.
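For readers approaching the toolbox from outside MATLAB, the computation behind a utility such as dist2 (the squared Euclidean distance between every pair of points drawn from two sets) can be sketched in Python with NumPy. This is an illustrative re-implementation of what the index entry above describes, not the Netlab source; the function name dist2_sketch is ours.

```python
import numpy as np

def dist2_sketch(x, c):
    """Squared Euclidean distance between every row of x and every row of c.

    x: (n, d) array, c: (m, d) array -> (n, m) array of squared distances,
    mirroring what Netlab's dist2 is documented to compute (an illustrative
    Python sketch, not the MATLAB implementation).
    """
    # ||x - c||^2 = ||x||^2 + ||c||^2 - 2 x.c, computed without explicit loops
    x2 = np.sum(x ** 2, axis=1)[:, None]   # (n, 1)
    c2 = np.sum(c ** 2, axis=1)[None, :]   # (1, m)
    d2 = x2 + c2 - 2.0 * (x @ c.T)
    return np.maximum(d2, 0.0)             # clamp tiny negative rounding errors
```

Computations of this shape underlie several of the functions listed above, such as mdndist2 and the basis-function activations of an RBF network.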
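Similarly, the idea behind gradchek (comparing a user-defined gradient against finite differences) can be sketched in Python, here applied to Rosenbrock's function as in the rosen, rosegrad, and demopt1 entries. All three functions below are our illustrative re-implementations, not the MATLAB sources.

```python
import numpy as np

def rosen_sketch(x):
    """Rosenbrock's function of two variables."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosen_grad_sketch(x):
    """Analytic gradient of Rosenbrock's function."""
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def gradchek_sketch(x, f, gradf, eps=1.0e-6):
    """Compare gradf(x) against central finite differences of f.

    This mirrors the idea behind Netlab's gradchek, as an illustrative
    Python sketch rather than the MATLAB implementation.
    """
    x = np.asarray(x, dtype=float)
    numeric = np.empty_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        # central difference: (f(x + h e_i) - f(x - h e_i)) / (2 h)
        numeric[i] = (f(x + step) - f(x - step)) / (2.0 * eps)
    return numeric, gradf(x)
```

A close match between the two returned vectors gives confidence that an analytic gradient routine is consistent with its error function, which is the check gradchek performs before the optimisers listed above are let loose on a network.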

Copyright (c) Christopher M Bishop, Ian T Nabney (1996, 1997)