<html>
<head>
<title>
Netlab Reference Manual knnfwd
</title>
</head>
<body>
<H1> knnfwd
</H1>
<h2>
Purpose
</h2>
Forward propagation through a K-nearest-neighbour classifier.

<p><h2>
Synopsis
</h2>
<PRE>

[y, l] = knnfwd(net, x)
</PRE>


<p><h2>
Description
</h2>
<CODE>[y, l] = knnfwd(net, x)</CODE> takes a matrix <CODE>x</CODE>
of input vectors (one vector per row) and uses the
<CODE>k</CODE>-nearest-neighbour rule on the training data contained in
<CODE>net</CODE> to produce a matrix <CODE>y</CODE> of outputs and a matrix
<CODE>l</CODE> of classification labels.
The nearest neighbours are determined using Euclidean distance.
The <CODE>ij</CODE>th entry of <CODE>y</CODE> counts how many of the
<CODE>k</CODE> training examples closest to example <CODE>i</CODE> from
<CODE>x</CODE> belong to class <CODE>j</CODE>.
The matrix <CODE>l</CODE> contains the predicted class labels
as indices 1..N, not as 1-of-N coding.

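<p>
As an illustration only (not the Netlab source), the computation for a single
test vector <CODE>xvec</CODE> can be sketched as below, assuming the
<CODE>knn</CODE> data structure stores the training inputs, the 1-of-N
training targets and the neighbour count in fields <CODE>tr_in</CODE>,
<CODE>tr_targets</CODE> and <CODE>k</CODE>:
<PRE>

% Squared Euclidean distances from xvec to every training example
d2 = sum((net.tr_in - repmat(xvec, size(net.tr_in, 1), 1)).^2, 2);
% Indices of the k nearest training examples
[sorted, order] = sort(d2);
nearest = order(1:net.k);
% Row of y: vote counts per class; entry of l: index of the winning class
yvec = sum(net.tr_targets(nearest, :), 1);
[votes, lvec] = max(yvec);
</PRE>
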
<p><h2>
Example
</h2>
<PRE>

net = knn(size(xtrain, 2), size(t_train, 2), 3, xtrain, t_train);
y = knnfwd(net, xtest);
conffig(y, t_test);
</PRE>

This creates a 3-nearest-neighbour model <CODE>net</CODE> and then applies it
to the test data <CODE>xtest</CODE>. The results are plotted as a confusion
matrix with <CODE>conffig</CODE>.

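<p>
When only the predicted class indices are needed, the second output can be
used directly. A minimal sketch, assuming the test targets
<CODE>t_test</CODE> are in 1-of-N coding:
<PRE>

[y, l] = knnfwd(net, xtest);
% Convert the 1-of-N test targets to class indices and compare
[maxval, true_idx] = max(t_test, [], 2);
accuracy = mean(l == true_idx);
</PRE>
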
<p><h2>
See Also
</h2>
<CODE><a href="kmeans.htm">kmeans</a></CODE>, <CODE><a href="knn.htm">knn</a></CODE><hr>
<b>Pages:</b>
<a href="index.htm">Index</a>
<hr>
<p>Copyright (c) Ian T Nabney (1996-9)


</body>
</html>