[y, l] = knnfwd(net, x)
[y, l] = knnfwd(net, x) takes a matrix x of input vectors (one vector per row) and uses the k-nearest-neighbour rule on the training data contained in net to produce a matrix y of outputs and a matrix l of classification labels. The nearest neighbours are determined using Euclidean distance. The ij-th entry of y counts the number of times an example from class j occurs among the k closest training examples to example i from x. The matrix l contains the predicted class labels as an index 1..N, not as 1-of-N coding.
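The counting rule above can be sketched in a few lines. Note that Netlab itself is a MATLAB toolbox; the Python function below (the name knnfwd_sketch and its argument layout are illustrative, not part of Netlab) is only a minimal re-statement of the rule, assuming training labels are given as class indices 1..N:

```python
import numpy as np

def knnfwd_sketch(xtrain, labels, k, x):
    """Sketch of the k-NN rule described above (not Netlab's implementation).
    labels holds a class index 1..N for each row of xtrain."""
    nclasses = labels.max()
    y = np.zeros((x.shape[0], nclasses), dtype=int)
    for i, xi in enumerate(x):
        # squared Euclidean distance from xi to every training example
        d2 = ((xtrain - xi) ** 2).sum(axis=1)
        nearest = np.argsort(d2)[:k]       # indices of the k closest rows
        for j in labels[nearest]:
            y[i, j - 1] += 1               # count votes for class j
    l = y.argmax(axis=1) + 1               # predicted label as index 1..N
    return y, l
```

Each row of y then sums to k, and l simply picks the class with the most votes in that row.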
net = knn(size(xtrain, 2), size(t_train, 2), 3, xtrain, t_train);
y = knnfwd(net, xtest);
conffig(y, t_test);

Creates a 3-nearest-neighbour model net and then applies it to the data xtest. The results are plotted as a confusion matrix with conffig.
See also: kmeans, knn

Copyright (c) Ian T Nabney (1996-9)