function dist = KLDiv(P, Q)
% dist = KLDiv(P,Q) Kullback-Leibler divergence of two discrete probability
% distributions.
% P and Q are automatically normalised so that each row sums to one.
%   P    = n x nbins
%   Q    = 1 x nbins (one Q for all rows of P) or n x nbins (one-to-one)
%   dist = n x 1

if size(P,2) ~= size(Q,2)
    error('the number of columns in P and Q should be the same');
end

if sum(~isfinite(P(:))) + sum(~isfinite(Q(:)))
    error('the inputs contain non-finite values!');
end

% normalise P and Q so that each row sums to one
if size(Q,1) == 1
    Q = Q ./ sum(Q);
    P = P ./ repmat(sum(P,2), [1 size(P,2)]);
    dist = sum(P .* log(P ./ repmat(Q, [size(P,1) 1])), 2);
elseif size(Q,1) == size(P,1)
    Q = Q ./ repmat(sum(Q,2), [1 size(Q,2)]);
    P = P ./ repmat(sum(P,2), [1 size(P,2)]);
    dist = sum(P .* log(P ./ Q), 2);
else
    error('Q must have either one row or the same number of rows as P');
end

% resolve the case when P(i)==0, using the convention 0*log(0) = 0
dist(isnan(dist)) = 0;
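
% --- Example usage (a minimal sketch added for illustration; the variable
% names and values below are hypothetical and not part of the original
% source) ---
%
%   P = [0.5 0.5; 0.9 0.1];   % two distributions, one per row, over 2 bins
%   Q = [0.5 0.5];            % a single 1 x nbins reference distribution,
%                             % compared against every row of P
%   d = KLDiv(P, Q);          % d is 2 x 1; d(1) is 0 since row 1 equals Q
%
% Note that KL divergence is asymmetric: KLDiv(P,Q) is generally not equal
% to KLDiv(Q,P), and a zero entry in Q where P is nonzero yields Inf.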