function dist = JSDiv(P, Q)
% Jensen-Shannon divergence of two discrete probability distributions
%   dist = JSDiv(P, Q)
% P and Q are automatically normalised so that each row sums to one.
%   P    = n x nbins
%   Q    = 1 x nbins
%   dist = n x 1
% Requires KLDiv (Kullback-Leibler divergence) on the path.

if size(P,2) ~= size(Q,2)
    error('the number of columns in P and Q should be the same');
end

% normalise P and Q so each row sums to one
Q = Q ./ sum(Q);
Q = repmat(Q, [size(P,1) 1]);
P = P ./ repmat(sum(P,2), [1 size(P,2)]);

% mixture distribution M = (P + Q) / 2
M = 0.5 .* (P + Q);

% JS divergence: average KL divergence of P and Q to the mixture M
dist = 0.5 .* KLDiv(P, M) + 0.5 .* KLDiv(Q, M);
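% For reference, the same computation can be sketched in NumPy. This is an
% illustrative cross-check, not part of this toolbox: the `js_div` name and the
% inlined natural-log KL term are assumptions (the MATLAB version delegates to a
% separate KLDiv whose log base this sketch does not know).

```python
import numpy as np

def js_div(P, Q):
    """Jensen-Shannon divergence of each row of P against Q.

    P: (n, nbins) array of histograms, Q: (nbins,) histogram.
    Rows are normalised to sum to one, mirroring the MATLAB code.
    Uses the natural log, so identical inputs give 0 and disjoint
    supports give log(2).
    """
    P = np.atleast_2d(P).astype(float)
    Q = np.asarray(Q, dtype=float)
    P = P / P.sum(axis=1, keepdims=True)
    Q = Q / Q.sum()
    M = 0.5 * (P + Q)  # mixture distribution, broadcast over rows of P

    def kl(A, B):
        # row-wise KL divergence, treating 0 * log 0 as 0
        out = np.zeros(A.shape[0])
        for i in range(A.shape[0]):
            m = A[i] > 0
            out[i] = np.sum(A[i, m] * np.log(A[i, m] / B[i, m]))
        return out

    Qrows = np.broadcast_to(Q, P.shape)
    return 0.5 * kl(P, M) + 0.5 * kl(Qrows, M)
```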