Netlab Reference Manual

mdn

Purpose

Creates a Mixture Density Network with specified architecture.

Synopsis

net = mdn(nin, nhidden, ncentres, dimtarget)
net = mdn(nin, nhidden, ncentres, dimtarget, mixtype, ...
	prior, beta)

Description

net = mdn(nin, nhidden, ncentres, dimtarget) takes the number of inputs and hidden units for a 2-layer feed-forward network, together with the number of centres and the target dimension for the mixture model whose parameters are set from the outputs of the neural network. The fifth argument mixtype defines the type of mixture model. (Currently only one type is supported: a mixture of Gaussians with a single covariance parameter for each component.) For this model, the mixture coefficients are computed from a group of softmax outputs, the centres are given by a group of linear outputs, and the variances are obtained by applying the exponential function to a third group of outputs.
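
As an illustration (not part of the Netlab source), the sketch below shows how a single MLP output vector could be split into the three groups described above. Counting ncentres mixing coefficients, ncentres*dimtarget centre coordinates and ncentres variances, the spherical model needs ncentres*(2 + dimtarget) network outputs; the ordering of the groups shown here is an assumption made purely for illustration, since the actual split is performed inside mdnfwd.

  % Illustrative sketch only: the ordering of the output groups is assumed.
  ncentres = 3; dimtarget = 1;
  y = randn(1, ncentres*(2 + dimtarget));          % stand-in for one row of MLP output

  z_alpha = y(1:ncentres);                         % softmax group
  z_mu    = y(ncentres+1:ncentres*(1+dimtarget));  % linear group
  z_sigma = y(ncentres*(1+dimtarget)+1:end);       % exponential group

  alpha  = exp(z_alpha)./sum(exp(z_alpha));        % mixing coefficients (sum to one)
  mu     = reshape(z_mu, ncentres, dimtarget);     % component centres
  sigma2 = exp(z_sigma);                           % positive spherical variances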

The network is initialised by a call to mlp, and the arguments prior and beta have the same role as for that function. Weight initialisation uses the Matlab function randn, so the seed for the random weight initialisation can be set with randn('state', s), where s is the seed value. A specialised data structure (rather than gmm) is used for the mixture model outputs to improve the efficiency of error and gradient calculations in network training. The fields of this structure are described in mdnfwd, where they are set up.
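
For example, fixing the seed before the constructor is called gives a repeatable set of initial weights (the seed value 42 below is an arbitrary choice):

  randn('state', 42);          % fix the random number generator state
  net1 = mdn(2, 4, 3, 1);
  randn('state', 42);          % reset to the same state
  net2 = mdn(2, 4, 3, 1);      % net2 has the same initial weights as net1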

The fields in net are wolffd@0:

  type = 'mdn'
  nin = number of input variables
  nout = dimension of target space (not number of network outputs)
  nwts = total number of weights and biases
  mdnmixes = data structure for mixture model output
  mlp = data structure for MLP network
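
The distinction noted for nout can be checked directly. The sketch below assumes the spherical model, where the underlying MLP is expected to have ncentres*(2 + dimtarget) outputs:

  net = mdn(2, 4, 3, 1);
  net.nout        % 1: dimension of the target space
  net.mlp.nout    % expected to be 9: number of outputs of the underlying MLP
  net.nwts        % total number of weights and biases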

Example

net = mdn(2, 4, 3, 1, 'spherical');

This creates a Mixture Density Network with 2 inputs and 4 hidden units. The mixture model has 3 components and the target space has dimension 1.
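
A possible continuation is sketched below, assuming mdnfwd returns the mixture parameters for each input row and mdn2gmm converts them to standard gmm structures, as described on their own pages:

  x = randn(5, 2);                 % 5 input patterns with 2 variables each
  mixparams = mdnfwd(net, x);      % mixture model parameters for each pattern
  gmms = mdn2gmm(mixparams);       % array of gmm structures, one per pattern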

See Also

mdnfwd, mdnerr, mdn2gmm, mdngrad, mdnpak, mdnunpak, mlp

Copyright (c) Ian T Nabney (1996-9)

David J Evans (1998)