Netlab Reference Manual knnfwd

knnfwd


Purpose

Forward propagation through a K-nearest-neighbour classifier.

Synopsis

[y, l] = knnfwd(net, x)

Description

[y, l] = knnfwd(net, x) takes a matrix x of input vectors (one vector per row) and uses the k-nearest-neighbour rule on the training data contained in net to produce a matrix y of outputs and a matrix l of classification labels. The nearest neighbours are determined using Euclidean distance. The (i,j)th entry of y counts how many of the k training examples closest to example i of x belong to class j. The matrix l contains the predicted class labels as indices 1..N, not as 1-of-N coding.
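The counting rule above can be sketched in a few lines of stand-alone code. Netlab itself is a MATLAB toolbox; the Python function and variable names below are illustrative assumptions, not Netlab's implementation, and the tie-breaking choice (lowest class index wins) is likewise an assumption.

```python
def knnfwd_sketch(xtrain, tclass, k, x):
    """Hypothetical sketch of the knnfwd rule, not Netlab code.

    For each row of x, count how often each class appears among the
    k nearest training rows (Euclidean distance), then take the arg-max.

    xtrain : list of training vectors (one per row)
    tclass : list of class indices (1..N), one per training row
    k      : number of neighbours
    x      : list of query vectors

    Returns (y, l): y[i][j-1] is the neighbour count for class j at
    query i; l[i] is the predicted class index in 1..N (not 1-of-N).
    """
    nclasses = max(tclass)
    y, l = [], []
    for q in x:
        # Squared Euclidean distance to every training example
        # (squaring preserves the nearest-neighbour ordering).
        d = [sum((a - b) ** 2 for a, b in zip(row, q)) for row in xtrain]
        nearest = sorted(range(len(d)), key=d.__getitem__)[:k]
        counts = [0] * nclasses
        for idx in nearest:
            counts[tclass[idx] - 1] += 1
        y.append(counts)
        # Assumed tie-break: the lowest class index wins the vote.
        l.append(counts.index(max(counts)) + 1)
    return y, l
```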

Example

net = knn(size(xtrain, 2), size(t_train, 2), 3, xtrain, t_train);
y = knnfwd(net, xtest);
conffig(y, t_test);

Creates a 3-nearest-neighbour model net and then applies it to the data xtest. The results are plotted as a confusion matrix with conffig.

See Also

kmeans, knn

Copyright (c) Ian T Nabney (1996-9)