function KL = kldiv(varValue,pVect1,pVect2,varargin)
%KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.
% KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two
% distributions specified over the M variable values in vector X. P1 is a
% length-M vector of probabilities representing distribution 1, and P2 is a
% length-M vector of probabilities representing distribution 2. Thus, the
% probability of value X(i) is P1(i) for distribution 1 and P2(i) for
% distribution 2. The Kullback-Leibler divergence, in bits (the
% implementation uses base-2 logarithms), is given by:
%
%    KL(P1(x),P2(x)) = sum[P1(x).log2(P1(x)/P2(x))]
%
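% As a quick worked instance (values chosen purely for illustration):
% with P1 = [.5 .5] and P2 = [.25 .75],
%    KL = .5*log2(.5/.25) + .5*log2(.5/.75) = 0.2075 bits.
%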
% If X contains duplicate values, a warning is issued and the duplicate
% values are treated as distinct. (That is, the actual values do not
% enter into the computation; the probabilities for the two duplicate
% values are simply treated as probabilities corresponding to two
% unique values.) The elements of probability vectors P1 and P2 must
% each sum to 1 +/- .00001.
%
% A "log of zero" warning will be thrown for zero-valued probabilities.
% Handle this however you wish. Adding 'eps' or some other small value
% to all probabilities is a reasonable fix. (Renormalize if necessary.)
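% For example, a minimal smoothing step along those lines (renormalizing
% so the probabilities still pass the sum-to-one check):
%    P2 = P2 + eps;        % shift zero probabilities off zero
%    P2 = P2 ./ sum(P2);   % renormalize so the probabilities sum to 1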
%
% KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler
% divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic
% (2001).
%
% KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by
% [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article
% for "Kullback–Leibler divergence". This is equal to 1/2 the so-called
% "Jeffrey divergence." See Rubner et al. (2000).
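%
% Both variants are selected with a single trailing flag, e.g. (X, P1,
% and P2 as in the EXAMPLE below):
%    KLsym = kldiv(X,P1,P2,'sym');  % symmetrized KL divergence
%    KLjs  = kldiv(X,P1,P2,'js');   % Jensen-Shannon divergence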
%
% EXAMPLE: Let the event set and probability sets be as follows:
%    X  = [1 2 3 3 4]';
%    P1 = ones(5,1)/5;
%    P2 = [0 0 .5 .2 .3]' + eps;
%
% Note that the event set here has duplicate values (two 3's). These
% will be treated as DISTINCT events by KLDIV. If you want them treated
% as the SAME event, you will need to collapse their probabilities
% together before running KLDIV: use UNIQUE to find the set of unique
% events, then sum the probabilities over each instance of each unique
% event (see the sketch below). Here, we just leave the duplicate
% values to be treated independently (the default):
%
%    KL = kldiv(X,P1,P2);
%    KL =
%        19.4899
%
% Note also that we avoided the log-of-zero warning by adding 'eps' to
% all probability values in P2. We didn't need to renormalize because
% we're still within the sum-to-one tolerance.
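%
% A minimal sketch of that collapse step, vectorized with ACCUMARRAY
% rather than an explicit loop (uX, uP, and P are illustrative names;
% P is a probability column vector matching X):
%    [uX,~,ic] = unique(X);   % unique events plus an index map into them
%    uP = accumarray(ic,P);   % sum the probabilities per unique event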
%
% REFERENCES:
% 1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley,
%    1991.
% 2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler
%    distance." IEEE Transactions on Information Theory (Submitted).
% 3) Rubner, Y., Tomasi, C., and Guibas, L.J., 2000. "The Earth Mover's
%    distance as a metric for image retrieval." International Journal of
%    Computer Vision, 40(2): 99-121.
% 4) <a href="matlab:web('http://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence','-browser')">Kullback–Leibler divergence</a>. Wikipedia, The Free Encyclopedia.
%
% See also: MUTUALINFO, ENTROPY

if ~isequal(unique(varValue),sort(varValue))
    warning('KLDIV:duplicates','X contains duplicate values. Treated as distinct values.')
end
if ~isequal(size(varValue),size(pVect1)) || ~isequal(size(varValue),size(pVect2))
    error('All inputs must have same dimension.')
end
% Check that the probabilities sum to 1 (within tolerance):
if (abs(sum(pVect1) - 1) > .00001) || (abs(sum(pVect2) - 1) > .00001)
    error('Probabilities don''t sum to 1.')
end

if ~isempty(varargin)
    switch varargin{1}
        case 'js'
            % Jensen-Shannon divergence: mean of the KL divergences of
            % P1 and P2 from the mixture Q = (P1+P2)/2.
            logQvect = log2((pVect2+pVect1)/2);
            KL = .5 * (sum(pVect1.*(log2(pVect1)-logQvect)) + ...
                sum(pVect2.*(log2(pVect2)-logQvect)));

        case 'sym'
            % Symmetrized KL divergence: [KL(P1,P2)+KL(P2,P1)]/2.
            KL1 = sum(pVect1 .* (log2(pVect1)-log2(pVect2)));
            KL2 = sum(pVect2 .* (log2(pVect2)-log2(pVect1)));
            KL = (KL1+KL2)/2;

        otherwise
            error(['Last argument "' varargin{1} '" not recognized.'])
    end
else
    % Plain KL divergence, in bits: sum(P1 .* log2(P1./P2)).
    KL = sum(pVect1 .* (log2(pVect1)-log2(pVect2)));
end