diff toolboxes/FullBNT-1.0.7/nethelp3.3/knnfwd.htm @ 0:e9a9cd732c1e tip
first hg version after svn
author:    wolffd
date:      Tue, 10 Feb 2015 15:05:51 +0000
parents:
children:
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/toolboxes/FullBNT-1.0.7/nethelp3.3/knnfwd.htm	Tue Feb 10 15:05:51 2015 +0000
@@ -0,0 +1,66 @@
+<html>
+<head>
+<title>
+Netlab Reference Manual knnfwd
+</title>
+</head>
+<body>
+<H1> knnfwd
+</H1>
+<h2>
+Purpose
+</h2>
+Forward propagation through a K-nearest-neighbour classifier.
+
+<p><h2>
+Synopsis
+</h2>
+<PRE>
+
+[y, l] = knnfwd(net, x)
+</PRE>
+
+
+<p><h2>
+Description
+</h2>
+<CODE>[y, l] = knnfwd(net, x)</CODE> takes a matrix <CODE>x</CODE>
+of input vectors (one vector per row)
+ and uses the <CODE>k</CODE>-nearest-neighbour rule on the training data contained
+in <CODE>net</CODE> to
+produce
+a matrix <CODE>y</CODE> of outputs and a matrix <CODE>l</CODE> of classification
+labels.
+The nearest neighbours are determined using Euclidean distance.
+The <CODE>ij</CODE>th entry of <CODE>y</CODE> counts the number of occurrences that
+an example from class <CODE>j</CODE> is among the <CODE>k</CODE> closest training
+examples to example <CODE>i</CODE> from <CODE>x</CODE>.
+The matrix <CODE>l</CODE> contains the predicted class labels
+as an index 1..N, not as 1-of-N coding.
+
+<p><h2>
+Example
+</h2>
+<PRE>
+
+net = knn(size(xtrain, 2), size(t_train, 2), 3, xtrain, t_train);
+y = knnfwd(net, xtest);
+conffig(y, t_test);
+</PRE>
+
+Creates a 3 nearest neighbour model <CODE>net</CODE> and then applies it to
+the data <CODE>xtest</CODE>. The results are plotted as a confusion matrix with
+<CODE>conffig</CODE>.
+
+<p><h2>
+See Also
+</h2>
+<CODE><a href="kmeans.htm">kmeans</a></CODE>, <CODE><a href="knn.htm">knn</a></CODE><hr>
+<b>Pages:</b>
+<a href="index.htm">Index</a>
+<hr>
+<p>Copyright (c) Ian T Nabney (1996-9)
+
+
+</body>
+</html>
\ No newline at end of file
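
For readers who want to see what the Description above amounts to numerically, here is a minimal MATLAB sketch of the k-nearest-neighbour vote that knnfwd performs. It is not the Netlab source: the field names net.k, net.tr_in and net.tr_targets are assumptions about how knn stores the number of neighbours, the training inputs and the 1-of-N training targets, and ties are broken by max's default preference for the lowest class index.

    % Minimal sketch (not the Netlab source) of the computation described
    % in the Description section.  Assumed fields: net.k (number of
    % neighbours), net.tr_in (training inputs, one row per example) and
    % net.tr_targets (training targets in 1-of-N coding).
    function [y, l] = knnfwd_sketch(net, x)
      ntest = size(x, 1);
      y = zeros(ntest, size(net.tr_targets, 2));
      for i = 1:ntest
        % Squared Euclidean distances from test point i to every training point
        d = sum((net.tr_in - repmat(x(i, :), size(net.tr_in, 1), 1)).^2, 2);
        [~, order] = sort(d);
        % y(i, j) counts how often class j occurs among the k nearest neighbours
        y(i, :) = sum(net.tr_targets(order(1:net.k), :), 1);
      end
      % Predicted label is the most frequent class, as an index 1..N
      [~, l] = max(y, [], 2);
    end

Called the same way as in the Example section, [y, l] = knnfwd_sketch(net, xtest) reproduces the vote counts and labels for small data sets, although the actual Netlab implementation may organise the distance computation differently (for instance, fully vectorised rather than looping over test points).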