function kl = cross_entropy(p, q, symmetric)
% CROSS_ENTROPY Compute the Kullback-Leibler divergence between two discrete prob. distributions
% kl = cross_entropy(p, q, symmetric)
%
% If symmetric = 1, we compute the symmetric version. Default: symmetric = 0.

tiny = exp(-700); % small constant to avoid log(0) and division by zero
if nargin < 3, symmetric = 0; end
p = p(:);
q = q(:);
if symmetric
  kl = (sum(p .* log((p+tiny)./(q+tiny))) + sum(q .* log((q+tiny)./(p+tiny))))/2;
else
  kl = sum(p .* log((p+tiny)./(q+tiny)));
end