toolboxes/FullBNT-1.0.7/KPMtools/cross_entropy.m @ 0:e9a9cd732c1e (tip)

first hg version after svn
author wolffd
date Tue, 10 Feb 2015 15:05:51 +0000
function kl = cross_entropy(p, q, symmetric)
% CROSS_ENTROPY Compute the Kullback-Leibler divergence between two discrete prob. distributions
% kl = cross_entropy(p, q, symmetric)
%
% If symmetric = 1, we compute the symmetric version. Default: symmetric = 0.

tiny = exp(-700); % small constant to avoid log(0) and division by zero
if nargin < 3, symmetric = 0; end
p = p(:);
q = q(:);
if symmetric
  % average of KL(p||q) and KL(q||p)
  kl = (sum(p .* log((p+tiny)./(q+tiny))) + sum(q .* log((q+tiny)./(p+tiny))))/2;
else
  % KL(p||q) = sum_i p(i) log(p(i)/q(i))
  kl = sum(p .* log((p+tiny)./(q+tiny)));
end
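
A minimal usage sketch; the distributions below are illustrative values, not part of the toolbox:

% Hypothetical example: two discrete distributions over three outcomes
p = [0.5 0.3 0.2];
q = [0.4 0.4 0.2];
kl_pq  = cross_entropy(p, q);      % KL(p||q), asymmetric
kl_sym = cross_entropy(p, q, 1);   % symmetrized: (KL(p||q) + KL(q||p)) / 2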