diff toolboxes/FullBNT-1.0.7/graph/minimum_spanning_tree.m @ 0:e9a9cd732c1e tip
first hg version after svn
| author | wolffd |
|---|---|
| date | Tue, 10 Feb 2015 15:05:51 +0000 |
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/toolboxes/FullBNT-1.0.7/graph/minimum_spanning_tree.m	Tue Feb 10 15:05:51 2015 +0000
@@ -0,0 +1,51 @@
+function A = minimum_spanning_tree(C1, C2)
+%
+% Find the minimum spanning tree using Prim's algorithm.
+% C1(i,j) is the primary cost of connecting i to j.
+% C2(i,j) is the (optional) secondary cost of connecting i to j, used to break ties.
+% We assume that absent edges have 0 cost.
+% To find the maximum spanning tree, use -1*C1 (and -1*C2).
+% See Aho, Hopcroft & Ullman 1983, "Data structures and algorithms", p 237.
+
+% Prim's algorithm is O(V^2). Kruskal's algorithm is O(E log E) and hence more efficient
+% for sparse graphs, but it requires a priority queue.
+
+% We partition the nodes into those in U and those not in U.
+% closest(i) is the vertex in U that is closest to i in V-U.
+% lowcost(i) is the cost of the edge (i, closest(i)), or infinity if i has been used.
+% In Aho, they say C(i,j) should be "some appropriate large value" if the edge is missing.
+% We set it to infinity.
+% However, since lowcost is initialized from C, we must distinguish absent edges from used nodes.
+
+n = length(C1);
+if nargin==1, C2 = zeros(n); end
+A = zeros(n);
+
+closest = ones(1,n);
+used = zeros(1,n); % used(i)=1 if node i is a member of U
+used(1) = 1; % start with node 1
+C1(find(C1==0)) = inf;
+C2(find(C2==0)) = inf;
+lowcost1 = C1(1,:);
+lowcost2 = C2(1,:);
+
+for i=2:n
+  ks = find(lowcost1==min(lowcost1));
+  k = ks(argmin(lowcost2(ks)));
+  A(k, closest(k)) = 1;
+  A(closest(k), k) = 1;
+  lowcost1(k) = inf;
+  lowcost2(k) = inf;
+  used(k) = 1;
+  NU = find(used==0);
+  for ji=1:length(NU)
+    for j=NU(ji)
+      if C1(k,j) < lowcost1(j)
+        lowcost1(j) = C1(k,j);
+        lowcost2(j) = C2(k,j);
+        closest(j) = k;
+      end
+    end
+  end
+end
+
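
For context, a minimal usage sketch (not part of the changeset; the cost matrix below is made up for illustration). Zero off-diagonal entries in C1 denote absent edges, and the function relies on BNT's argmin helper being on the MATLAB path.

% Hypothetical example: complete graph on 3 nodes with edge costs 2, 4, and 1.
C1 = [0 2 4;
      2 0 1;
      4 1 0];
A = minimum_spanning_tree(C1);   % C2 defaults to zeros, i.e. no tie-breaking preference
% Expected adjacency: edges 1-2 (cost 2) and 2-3 (cost 1),
% i.e. A = [0 1 0; 1 0 1; 0 1 0].
% To break ties between equal primary costs, pass a secondary cost matrix:
% A = minimum_spanning_tree(C1, C2);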