<html>
<head>
<title>
NETLAB Reference Documentation 
</title>
</head>
<body>
<H1> NETLAB Online Reference Documentation </H1>
Welcome to the NETLAB online reference documentation.
The NETLAB simulation software is designed to provide all the tools necessary
for principled and theoretically well-founded application development. The
NETLAB library is based on the approach and techniques described in <I>Neural
Networks for Pattern Recognition</I> (Bishop, 1995). The library includes software
implementations of a wide range of data analysis techniques, many of which are
not widely available and are rarely, if ever, included in standard neural
network simulation packages.
<p>The online reference documentation provides direct hypertext links to specific Netlab function descriptions.
<p>If you have any comments or problems to report, please contact Ian Nabney (<a href="mailto:i.t.nabney@aston.ac.uk"><tt>i.t.nabney@aston.ac.uk</tt></a>) or Christopher Bishop (<a href="mailto:c.m.bishop@aston.ac.uk"><tt>c.m.bishop@aston.ac.uk</tt></a>).
<H1> Index
</H1>
An alphabetical list of the functions in Netlab; a short usage sketch follows the list.<p>
<DL>
<DT>
<CODE><a href="conffig.htm">conffig</a></CODE><DD>
 Display a confusion matrix. 
<DT>
<CODE><a href="confmat.htm">confmat</a></CODE><DD>
 Compute a confusion matrix. 
<DT>
<CODE><a href="conjgrad.htm">conjgrad</a></CODE><DD>
 Conjugate gradients optimization. 
<DT>
<CODE><a href="consist.htm">consist</a></CODE><DD>
 Check that arguments are consistent. 
<DT>
<CODE><a href="convertoldnet.htm">convertoldnet</a></CODE><DD>
 Convert pre-2.3 release MLP and MDN nets to the new format. 
<DT>
<CODE><a href="datread.htm">datread</a></CODE><DD>
 Read data from an ASCII file. 
<DT>
<CODE><a href="datwrite.htm">datwrite</a></CODE><DD>
 Write data to an ASCII file. 
<DT>
<CODE><a href="dem2ddat.htm">dem2ddat</a></CODE><DD>
 Generates two-dimensional data for demos. 
<DT>
<CODE><a href="demard.htm">demard</a></CODE><DD>
 Automatic relevance determination using the MLP. 
<DT>
<CODE><a href="demev1.htm">demev1</a></CODE><DD>
 Demonstrate Bayesian regression for the MLP. 
<DT>
<CODE><a href="demev2.htm">demev2</a></CODE><DD>
 Demonstrate Bayesian classification for the MLP. 
<DT>
<CODE><a href="demev3.htm">demev3</a></CODE><DD>
 Demonstrate Bayesian regression for the RBF. 
<DT>
<CODE><a href="demgauss.htm">demgauss</a></CODE><DD>
 Demonstrate sampling from Gaussian distributions. 
<DT>
<CODE><a href="demglm1.htm">demglm1</a></CODE><DD>
 Demonstrate simple classification using a generalized linear model. 
<DT>
<CODE><a href="demglm2.htm">demglm2</a></CODE><DD>
 Demonstrate simple classification using a generalized linear model. 
<DT>
<CODE><a href="demgmm1.htm">demgmm1</a></CODE><DD>
 Demonstrate density modelling with a Gaussian mixture model. 
<DT>
<CODE><a href="demgmm3.htm">demgmm3</a></CODE><DD>
 Demonstrate density modelling with a Gaussian mixture model. 
<DT>
<CODE><a href="demgmm4.htm">demgmm4</a></CODE><DD>
 Demonstrate density modelling with a Gaussian mixture model. 
<DT>
<CODE><a href="demgmm5.htm">demgmm5</a></CODE><DD>
 Demonstrate density modelling with a PPCA mixture model. 
<DT>
<CODE><a href="demgp.htm">demgp</a></CODE><DD>
 Demonstrate simple regression using a Gaussian Process. 
<DT>
<CODE><a href="demgpard.htm">demgpard</a></CODE><DD>
 Demonstrate ARD using a Gaussian Process. 
<DT>
<CODE><a href="demgpot.htm">demgpot</a></CODE><DD>
 Computes the gradient of the negative log likelihood for a mixture model. 
<DT>
<CODE><a href="demgtm1.htm">demgtm1</a></CODE><DD>
 Demonstrate EM for GTM. 
<DT>
<CODE><a href="demgtm2.htm">demgtm2</a></CODE><DD>
 Demonstrate GTM for visualisation. 
<DT>
<CODE><a href="demhint.htm">demhint</a></CODE><DD>
 Demonstration of Hinton diagram for 2-layer feed-forward network. 
<DT>
<CODE><a href="demhmc1.htm">demhmc1</a></CODE><DD>
 Demonstrate Hybrid Monte Carlo sampling on mixture of two Gaussians. 
<DT>
<CODE><a href="demhmc2.htm">demhmc2</a></CODE><DD>
 Demonstrate Bayesian regression with Hybrid Monte Carlo sampling. 
<DT>
<CODE><a href="demhmc3.htm">demhmc3</a></CODE><DD>
 Demonstrate Bayesian regression with Hybrid Monte Carlo sampling. 
<DT>
<CODE><a href="demkmean.htm">demkmean</a></CODE><DD>
 Demonstrate a simple clustering model trained with K-means. 
<DT>
<CODE><a href="demknn1.htm">demknn1</a></CODE><DD>
 Demonstrate nearest neighbour classifier. 
<DT>
<CODE><a href="demmdn1.htm">demmdn1</a></CODE><DD>
 Demonstrate fitting a multi-valued function using a Mixture Density Network. 
<DT>
<CODE><a href="demmet1.htm">demmet1</a></CODE><DD>
 Demonstrate Markov Chain Monte Carlo sampling on a Gaussian. 
<DT>
<CODE><a href="demmlp1.htm">demmlp1</a></CODE><DD>
 Demonstrate simple regression using a multi-layer perceptron. 
<DT>
<CODE><a href="demmlp2.htm">demmlp2</a></CODE><DD>
 Demonstrate simple classification using a multi-layer perceptron. 
<DT>
<CODE><a href="demnlab.htm">demnlab</a></CODE><DD>
 A front-end Graphical User Interface to the demos. 
<DT>
<CODE><a href="demns1.htm">demns1</a></CODE><DD>
 Demonstrate Neuroscale for visualisation. 
<DT>
<CODE><a href="demolgd1.htm">demolgd1</a></CODE><DD>
 Demonstrate simple MLP optimisation with on-line gradient descent. 
<DT>
<CODE><a href="demopt1.htm">demopt1</a></CODE><DD>
 Demonstrate different optimisers on Rosenbrock's function. 
<DT>
<CODE><a href="dempot.htm">dempot</a></CODE><DD>
 Computes the negative log likelihood for a mixture model. 
<DT>
<CODE><a href="demprgp.htm">demprgp</a></CODE><DD>
 Demonstrate sampling from a Gaussian Process prior. 
<DT>
<CODE><a href="demprior.htm">demprior</a></CODE><DD>
 Demonstrate sampling from a multi-parameter Gaussian prior. 
<DT>
<CODE><a href="demrbf1.htm">demrbf1</a></CODE><DD>
 Demonstrate simple regression using a radial basis function network. 
<DT>
<CODE><a href="demsom1.htm">demsom1</a></CODE><DD>
 Demonstrate SOM for visualisation. 
<DT>
<CODE><a href="demtrain.htm">demtrain</a></CODE><DD>
 Demonstrate training of MLP network. 
<DT>
<CODE><a href="dist2.htm">dist2</a></CODE><DD>
 Calculates squared distance between two sets of points. 
<DT>
<CODE><a href="eigdec.htm">eigdec</a></CODE><DD>
 Sorted eigendecomposition. 
<DT>
<CODE><a href="errbayes.htm">errbayes</a></CODE><DD>
 Evaluate Bayesian error function for network. 
<DT>
<CODE><a href="evidence.htm">evidence</a></CODE><DD>
 Re-estimate hyperparameters using evidence approximation. 
<DT>
<CODE><a href="fevbayes.htm">fevbayes</a></CODE><DD>
 Evaluate Bayesian regularisation for network forward propagation. 
<DT>
<CODE><a href="gauss.htm">gauss</a></CODE><DD>
 Evaluate a Gaussian distribution. 
<DT>
<CODE><a href="gbayes.htm">gbayes</a></CODE><DD>
 Evaluate gradient of Bayesian error function for network. 
<DT>
<CODE><a href="glm.htm">glm</a></CODE><DD>
 Create a generalized linear model. 
<DT>
<CODE><a href="glmderiv.htm">glmderiv</a></CODE><DD>
 Evaluate derivatives of GLM outputs with respect to weights. 
<DT>
<CODE><a href="glmerr.htm">glmerr</a></CODE><DD>
 Evaluate error function for generalized linear model. 
<DT>
<CODE><a href="glmevfwd.htm">glmevfwd</a></CODE><DD>
 Forward propagation with evidence for a GLM. 
<DT>
<CODE><a href="glmfwd.htm">glmfwd</a></CODE><DD>
 Forward propagation through generalized linear model. 
<DT>
<CODE><a href="glmgrad.htm">glmgrad</a></CODE><DD>
 Evaluate gradient of error function for generalized linear model. 
<DT>
<CODE><a href="glmhess.htm">glmhess</a></CODE><DD>
 Evaluate the Hessian matrix for a generalised linear model. 
<DT>
<CODE><a href="glminit.htm">glminit</a></CODE><DD>
 Initialise the weights in a generalized linear model. 
<DT>
<CODE><a href="glmpak.htm">glmpak</a></CODE><DD>
 Combines weights and biases into one weights vector. 
<DT>
<CODE><a href="glmtrain.htm">glmtrain</a></CODE><DD>
 Specialised training of a generalized linear model. 
<DT>
<CODE><a href="glmunpak.htm">glmunpak</a></CODE><DD>
 Separates weights vector into weight and bias matrices. 
<DT>
<CODE><a href="gmm.htm">gmm</a></CODE><DD>
 Creates a Gaussian mixture model with specified architecture. 
<DT>
<CODE><a href="gmmactiv.htm">gmmactiv</a></CODE><DD>
 Computes the activations of a Gaussian mixture model. 
<DT>
<CODE><a href="gmmem.htm">gmmem</a></CODE><DD>
 EM algorithm for Gaussian mixture model. 
<DT>
<CODE><a href="gmminit.htm">gmminit</a></CODE><DD>
 Initialises a Gaussian mixture model from data. 
<DT>
<CODE><a href="gmmpak.htm">gmmpak</a></CODE><DD>
 Combines all the parameters in a Gaussian mixture model into one vector. 
<DT>
<CODE><a href="gmmpost.htm">gmmpost</a></CODE><DD>
 Computes the class posterior probabilities of a Gaussian mixture model. 
<DT>
<CODE><a href="gmmprob.htm">gmmprob</a></CODE><DD>
 Computes the data probability for a Gaussian mixture model. 
<DT>
<CODE><a href="gmmsamp.htm">gmmsamp</a></CODE><DD>
 Sample from a Gaussian mixture distribution. 
<DT>
<CODE><a href="gmmunpak.htm">gmmunpak</a></CODE><DD>
 Separates a vector of Gaussian mixture model parameters into its components. 
<DT>
<CODE><a href="gp.htm">gp</a></CODE><DD>
 Create a Gaussian Process. 
<DT>
<CODE><a href="gpcovar.htm">gpcovar</a></CODE><DD>
 Calculate the covariance for a Gaussian Process. 
<DT>
<CODE><a href="gpcovarf.htm">gpcovarf</a></CODE><DD>
 Calculate the covariance function for a Gaussian Process. 
<DT>
<CODE><a href="gpcovarp.htm">gpcovarp</a></CODE><DD>
 Calculate the prior covariance for a Gaussian Process. 
<DT>
<CODE><a href="gperr.htm">gperr</a></CODE><DD>
 Evaluate error function for Gaussian Process. 
<DT>
<CODE><a href="gpfwd.htm">gpfwd</a></CODE><DD>
 Forward propagation through Gaussian Process. 
<DT>
<CODE><a href="gpgrad.htm">gpgrad</a></CODE><DD>
 Evaluate error gradient for Gaussian Process. 
<DT>
<CODE><a href="gpinit.htm">gpinit</a></CODE><DD>
 Initialise Gaussian Process model. 
<DT>
<CODE><a href="gppak.htm">gppak</a></CODE><DD>
 Combines GP hyperparameters into one vector. 
<DT>
<CODE><a href="gpunpak.htm">gpunpak</a></CODE><DD>
 Separates hyperparameter vector into components. 
<DT>
<CODE><a href="gradchek.htm">gradchek</a></CODE><DD>
 Checks a user-defined gradient function using finite differences. 
<DT>
<CODE><a href="graddesc.htm">graddesc</a></CODE><DD>
 Gradient descent optimization. 
<DT>
<CODE><a href="gsamp.htm">gsamp</a></CODE><DD>
 Sample from a Gaussian distribution. 
<DT>
<CODE><a href="gtm.htm">gtm</a></CODE><DD>
 Create a Generative Topographic Map. 
<DT>
<CODE><a href="gtmem.htm">gtmem</a></CODE><DD>
 EM algorithm for Generative Topographic Mapping. 
<DT>
<CODE><a href="gtmfwd.htm">gtmfwd</a></CODE><DD>
 Forward propagation through GTM. 
<DT>
<CODE><a href="gtminit.htm">gtminit</a></CODE><DD>
 Initialise the weights and latent sample in a GTM. 
<DT>
<CODE><a href="gtmlmean.htm">gtmlmean</a></CODE><DD>
 Mean responsibility for data in a GTM. 
<DT>
<CODE><a href="gtmlmode.htm">gtmlmode</a></CODE><DD>
 Mode responsibility for data in a GTM. 
<DT>
<CODE><a href="gtmmag.htm">gtmmag</a></CODE><DD>
 Magnification factors for a GTM. 
<DT>
<CODE><a href="gtmpost.htm">gtmpost</a></CODE><DD>
 Latent space responsibility for data in a GTM. 
<DT>
<CODE><a href="gtmprob.htm">gtmprob</a></CODE><DD>
 Probability for data under a GTM. 
<DT>
<CODE><a href="hbayes.htm">hbayes</a></CODE><DD>
 Evaluate Hessian of Bayesian error function for network. 
<DT>
<CODE><a href="hesschek.htm">hesschek</a></CODE><DD>
 Use central differences to confirm correct evaluation of Hessian matrix. 
<DT>
<CODE><a href="hintmat.htm">hintmat</a></CODE><DD>
 Evaluates the coordinates of the patches for a Hinton diagram. 
<DT>
<CODE><a href="hinton.htm">hinton</a></CODE><DD>
 Plot Hinton diagram for a weight matrix. 
<DT>
<CODE><a href="histp.htm">histp</a></CODE><DD>
 Histogram estimate of 1-dimensional probability distribution. 
<DT>
<CODE><a href="hmc.htm">hmc</a></CODE><DD>
 Hybrid Monte Carlo sampling. 
<DT>
<CODE><a href="kmeans.htm">kmeans</a></CODE><DD>
 Trains a k-means cluster model. 
<DT>
<CODE><a href="knn.htm">knn</a></CODE><DD>
 Creates a K-nearest-neighbour classifier. 
<DT>
<CODE><a href="knnfwd.htm">knnfwd</a></CODE><DD>
 Forward propagation through a K-nearest-neighbour classifier. 
<DT>
<CODE><a href="linef.htm">linef</a></CODE><DD>
 Calculate function value along a line. 
<DT>
<CODE><a href="linemin.htm">linemin</a></CODE><DD>
 One-dimensional minimization. 
<DT>
<CODE><a href="maxitmess.htm">maxitmess</a></CODE><DD>
 Create a standard error message when training reaches the maximum number of iterations. 
<DT>
<CODE><a href="mdn.htm">mdn</a></CODE><DD>
 Creates a Mixture Density Network with specified architecture. 
<DT>
<CODE><a href="mdn2gmm.htm">mdn2gmm</a></CODE><DD>
 Converts an MDN mixture data structure to an array of GMMs. 
<DT>
<CODE><a href="mdndist2.htm">mdndist2</a></CODE><DD>
 Calculates the squared distance between the centres of Gaussian kernels and the data. 
<DT>
<CODE><a href="mdnerr.htm">mdnerr</a></CODE><DD>
 Evaluate error function for Mixture Density Network. 
<DT>
<CODE><a href="mdnfwd.htm">mdnfwd</a></CODE><DD>
 Forward propagation through Mixture Density Network. 
<DT>
<CODE><a href="mdngrad.htm">mdngrad</a></CODE><DD>
 Evaluate gradient of error function for Mixture Density Network. 
<DT>
<CODE><a href="mdninit.htm">mdninit</a></CODE><DD>
 Initialise the weights in a Mixture Density Network. 
<DT>
<CODE><a href="mdnpak.htm">mdnpak</a></CODE><DD>
 Combines weights and biases into one weights vector. 
<DT>
<CODE><a href="mdnpost.htm">mdnpost</a></CODE><DD>
 Computes the posterior probability for each MDN mixture component. 
<DT>
<CODE><a href="mdnprob.htm">mdnprob</a></CODE><DD>
 Computes the data likelihood for an MDN mixture structure. 
<DT>
<CODE><a href="mdnunpak.htm">mdnunpak</a></CODE><DD>
 Separates weights vector into weight and bias matrices. 
<DT>
<CODE><a href="metrop.htm">metrop</a></CODE><DD>
 Markov Chain Monte Carlo sampling with Metropolis algorithm. 
<DT>
<CODE><a href="minbrack.htm">minbrack</a></CODE><DD>
 Bracket a minimum of a function of one variable. 
<DT>
<CODE><a href="mlp.htm">mlp</a></CODE><DD>
 Create a 2-layer feedforward network. 
<DT>
<CODE><a href="mlpbkp.htm">mlpbkp</a></CODE><DD>
 Backpropagate gradient of error function for 2-layer network. 
<DT>
<CODE><a href="mlpderiv.htm">mlpderiv</a></CODE><DD>
 Evaluate derivatives of network outputs with respect to weights. 
<DT>
<CODE><a href="mlperr.htm">mlperr</a></CODE><DD>
 Evaluate error function for 2-layer network. 
<DT>
<CODE><a href="mlpevfwd.htm">mlpevfwd</a></CODE><DD>
 Forward propagation with evidence for an MLP. 
<DT>
<CODE><a href="mlpfwd.htm">mlpfwd</a></CODE><DD>
 Forward propagation through 2-layer network. 
<DT>
<CODE><a href="mlpgrad.htm">mlpgrad</a></CODE><DD>
 Evaluate gradient of error function for 2-layer network. 
<DT>
<CODE><a href="mlphdotv.htm">mlphdotv</a></CODE><DD>
 Evaluate the product of the data Hessian with a vector. 
<DT>
<CODE><a href="mlphess.htm">mlphess</a></CODE><DD>
 Evaluate the Hessian matrix for a multi-layer perceptron network. 
<DT>
<CODE><a href="mlphint.htm">mlphint</a></CODE><DD>
 Plot Hinton diagram for 2-layer feed-forward network. 
<DT>
<CODE><a href="mlpinit.htm">mlpinit</a></CODE><DD>
 Initialise the weights in a 2-layer feedforward network. 
<DT>
<CODE><a href="mlppak.htm">mlppak</a></CODE><DD>
 Combines weights and biases into one weights vector. 
<DT>
<CODE><a href="mlpprior.htm">mlpprior</a></CODE><DD>
 Create Gaussian prior for mlp. 
<DT>
<CODE><a href="mlptrain.htm">mlptrain</a></CODE><DD>
 Utility to train an MLP network for demtrain. 
<DT>
<CODE><a href="mlpunpak.htm">mlpunpak</a></CODE><DD>
 Separates weights vector into weight and bias matrices. 
<DT>
<CODE><a href="netderiv.htm">netderiv</a></CODE><DD>
 Evaluate derivatives of network outputs with respect to weights generically. 
<DT>
<CODE><a href="neterr.htm">neterr</a></CODE><DD>
 Evaluate network error function for generic optimizers. 
<DT>
<CODE><a href="netevfwd.htm">netevfwd</a></CODE><DD>
 Generic forward propagation with evidence for a network. 
<DT>
<CODE><a href="netgrad.htm">netgrad</a></CODE><DD>
 Evaluate network error gradient for generic optimizers. 
<DT>
<CODE><a href="nethess.htm">nethess</a></CODE><DD>
 Evaluate the network Hessian. 
<DT>
<CODE><a href="netinit.htm">netinit</a></CODE><DD>
 Initialise the weights in a network. 
<DT>
<CODE><a href="netopt.htm">netopt</a></CODE><DD>
 Optimize the weights in a network model. 
<DT>
<CODE><a href="netpak.htm">netpak</a></CODE><DD>
 Combines weights and biases into one weights vector. 
<DT>
<CODE><a href="netunpak.htm">netunpak</a></CODE><DD>
 Separates weights vector into weight and bias matrices. 
<DT>
<CODE><a href="olgd.htm">olgd</a></CODE><DD>
 On-line gradient descent optimization. 
<DT>
<CODE><a href="pca.htm">pca</a></CODE><DD>
 Principal Components Analysis. 
<DT>
<CODE><a href="plotmat.htm">plotmat</a></CODE><DD>
 Display a matrix. 
<DT>
<CODE><a href="ppca.htm">ppca</a></CODE><DD>
 Probabilistic Principal Components Analysis. 
<DT>
<CODE><a href="quasinew.htm">quasinew</a></CODE><DD>
 Quasi-Newton optimization. 
<DT>
<CODE><a href="rbf.htm">rbf</a></CODE><DD>
 Creates an RBF network with specified architecture. 
<DT>
<CODE><a href="rbfbkp.htm">rbfbkp</a></CODE><DD>
 Backpropagate gradient of error function for RBF network. 
<DT>
<CODE><a href="rbfderiv.htm">rbfderiv</a></CODE><DD>
 Evaluate derivatives of RBF network outputs with respect to weights. 
<DT>
<CODE><a href="rbferr.htm">rbferr</a></CODE><DD>
 Evaluate error function for RBF network. 
<DT>
<CODE><a href="rbfevfwd.htm">rbfevfwd</a></CODE><DD>
 Forward propagation with evidence for an RBF network. 
<DT>
<CODE><a href="rbffwd.htm">rbffwd</a></CODE><DD>
 Forward propagation through RBF network with linear outputs. 
<DT>
<CODE><a href="rbfgrad.htm">rbfgrad</a></CODE><DD>
 Evaluate gradient of error function for RBF network. 
<DT>
<CODE><a href="rbfhess.htm">rbfhess</a></CODE><DD>
 Evaluate the Hessian matrix for RBF network. 
<DT>
<CODE><a href="rbfjacob.htm">rbfjacob</a></CODE><DD>
 Evaluate derivatives of RBF network outputs with respect to inputs. 
<DT>
<CODE><a href="rbfpak.htm">rbfpak</a></CODE><DD>
 Combines all the parameters in an RBF network into one weights vector. 
<DT>
<CODE><a href="rbfprior.htm">rbfprior</a></CODE><DD>
 Create Gaussian prior and output layer mask for RBF. 
<DT>
<CODE><a href="rbfsetbf.htm">rbfsetbf</a></CODE><DD>
 Set basis functions of RBF from data. 
<DT>
<CODE><a href="rbfsetfw.htm">rbfsetfw</a></CODE><DD>
 Set basis function widths of RBF. 
<DT>
<CODE><a href="rbftrain.htm">rbftrain</a></CODE><DD>
 Two-stage training of an RBF network. 
<DT>
<CODE><a href="rbfunpak.htm">rbfunpak</a></CODE><DD>
 Separates a vector of RBF weights into its components. 
<DT>
<CODE><a href="rosegrad.htm">rosegrad</a></CODE><DD>
 Calculate gradient of Rosenbrock's function. 
<DT>
<CODE><a href="rosen.htm">rosen</a></CODE><DD>
 Calculate Rosenbrock's function. 
<DT>
<CODE><a href="scg.htm">scg</a></CODE><DD>
 Scaled conjugate gradient optimization. 
<DT>
<CODE><a href="som.htm">som</a></CODE><DD>
 Creates a Self-Organising Map. 
<DT>
<CODE><a href="somfwd.htm">somfwd</a></CODE><DD>
 Forward propagation through a Self-Organising Map. 
<DT>
<CODE><a href="sompak.htm">sompak</a></CODE><DD>
 Combines node weights into one weights matrix. 
<DT>
<CODE><a href="somtrain.htm">somtrain</a></CODE><DD>
 Kohonen training algorithm for SOM. 
<DT>
<CODE><a href="somunpak.htm">somunpak</a></CODE><DD>
 Replaces node weights in SOM. 
</DL>
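<p>As a rough illustration of how the functions listed above fit together, the
following MATLAB sketch (not part of the Netlab distribution itself) creates a
small MLP with <CODE>mlp</CODE>, trains it with <CODE>netopt</CODE> using the
scaled conjugate gradient optimiser <CODE>scg</CODE>, and makes predictions
with <CODE>mlpfwd</CODE>. The data and the option settings shown are
illustrative assumptions following the standard Netlab options-vector
convention; see the individual function pages for the definitive interfaces.
<pre>
% Illustrative sketch only: a small regression problem with assumed data.
x = linspace(0, 1, 50)';                 % inputs (50 x 1)
t = sin(2*pi*x) + 0.1*randn(50, 1);      % noisy targets

% Create a 2-layer MLP: 1 input, 5 hidden units, 1 linear output.
net = mlp(1, 5, 1, 'linear');

% Options vector (assumed settings): display errors, at most 100 iterations.
options = zeros(1, 18);
options(1)  = 1;
options(14) = 100;

% Train with the scaled conjugate gradient optimiser and predict.
net = netopt(net, options, x, t, 'scg');
y   = mlpfwd(net, x);
</pre>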

<hr>
<p>Copyright (c) Christopher M Bishop, Ian T Nabney (1996, 1997)
</body>
</html>