annotate toolboxes/FullBNT-1.0.7/netlab3.3/knnfwd.m @ 0:e9a9cd732c1e tip

first hg version after svn
author wolffd
date Tue, 10 Feb 2015 15:05:51 +0000
function [y, l] = knnfwd(net, x)
%KNNFWD Forward propagation through a K-nearest-neighbour classifier.
%
% Description
% [Y, L] = KNNFWD(NET, X) takes a matrix X of input vectors (one vector
% per row) and uses the K-nearest-neighbour rule on the training data
% contained in NET to produce a matrix Y of outputs and a matrix L of
% classification labels.  The nearest neighbours are determined using
% Euclidean distance.  The IJth entry of Y counts the number of
% occurrences that an example from class J is among the K closest
% training examples to example I from X.  The matrix L contains the
% predicted class labels as an index 1..N, not as 1-of-N coding.
%
% See also
% KMEANS, KNN
%

% Copyright (c) Ian T Nabney (1996-2001)


errstring = consist(net, 'knn', x);
if ~isempty(errstring)
  error(errstring);
end

ntest = size(x, 1);                % Number of input vectors.
nclass = size(net.tr_targets, 2);  % Number of classes.

% Compute the matrix of squared distances between the training and test
% input vectors.  The matrix distsq has dimensions (ntrain, ntest).
distsq = dist2(net.tr_in, x);

% Sort the distances.  The index matrix kind has the same dimensions as
% distsq; each of its columns gives the indices of the elements in the
% corresponding column of distsq in ascending order.
[vals, kind] = sort(distsq);

y = zeros(ntest, nclass);
for k = 1:net.k
  % Take the predictions made by the kth nearest neighbours alone, as a
  % 1-of-N coded matrix, and accumulate them into the class vote counts.
  y = y + net.tr_targets(kind(k,:),:);
end

if nargout == 2
  % Convert the vote counts to class labels, breaking ties randomly.
  [temp, l] = max((y + 0.1*rand(size(y))), [], 2);
end
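
% Example usage (a sketch, not part of the original file; it assumes the
% Netlab constructor KNN(NIN, NOUT, K, TR_IN, TR_TARGETS) found elsewhere
% in the toolbox, and 1-of-N coded training targets):
%
%   tr_in = [0 0; 0 1; 1 0; 1 1];           % four 2-d training points
%   tr_targets = [1 0; 1 0; 0 1; 0 1];      % two classes, 1-of-N coded
%   net = knn(2, 2, 3, tr_in, tr_targets);  % 3-nearest-neighbour classifier
%   [y, l] = knnfwd(net, [0.1 0.2; 0.9 0.8]);
%   % Each row of y holds per-class vote counts (summing to net.k);
%   % l holds the winning class index for each test point.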