annotate matlab/RecomTST/RecommendedTST.m @ 68:cab8a215f9a1 tip

Minor
author Nic Cleju <nikcleju@gmail.com>
date Tue, 09 Jul 2013 14:50:09 +0300
parents 4393ad5bffc1
children
function beta = RecommendedTST(X,Y, nsweep,tol,xinitial,ro)

% function beta = RecommendedTST(X, Y, nsweep, tol, xinitial, ro)
% This function takes the measurement matrix, the measurements and the
% number of iterations, and applies the TST (two-stage thresholding)
% algorithm with optimally tuned parameters to the problem. For more
% information, refer to the paper "Optimally tuned iterative
% reconstruction algorithms for compressed sensing" by Arian Maleki and
% David L. Donoho.
%
% Inputs:
%   X        : measurement matrix. All columns are assumed to have
%              almost equal $\ell_2$ norms; the tuning was done on
%              matrices with unit column norm.
%   Y        : measurement (output) vector.
%   nsweep   : number of iterations. The default value is 300.
%   tol      : the algorithm stops once the relative prediction error,
%              i.e. ||Y - X*beta||_2 / ||Y||_2, falls below tol;
%              otherwise it runs for nsweep iterations. The default
%              value is 0.00001.
%   xinitial : optional initialization of the algorithm. All results
%              reported in the paper use the zero vector, which is the
%              default. Pass 0 to use the default.
%   ro       : optional sparsity level (rho). If not given, or given as
%              0, the algorithm uses the optimally tuned default value.
%
% Outputs:
%   beta     : the estimated coefficients.
%
% References:
% For more information about this algorithm, and for papers on related
% algorithms such as CoSaMP and SP, see the paper mentioned above and
% the references therein.

% Normalize so that columns have (on average) unit l2 norm, as assumed
% by the tuning.
colnorm = mean((sum(X.^2)).^(.5));
X = X./colnorm;
Y = Y./colnorm;
[n,p] = size(X);
delta = n/p;    % undersampling ratio

if nargin < 3
    nsweep = 300;
end
if nargin < 4
    tol = 0.00001;
end
% Use short-circuit || so the optional arguments are only examined when
% they were actually supplied; isequal handles both the scalar-0 flag
% and a vector xinitial.
if nargin < 5 || isequal(xinitial, 0)
    xinitial = zeros(p,1);
end
if nargin < 6 || isequal(ro, 0)
    % Optimally tuned rho as a function of delta = n/p.
    ro = 0.044417*delta^2 + 0.34142*delta + 0.14844;
end

% Both thresholding stages keep the same number of elements.
k1 = floor(ro*n);
k2 = floor(ro*n);

% Initialization
x1 = xinitial;
I = [];

for sweep = 1:nsweep
    r = Y - X*x1;                 % residual
    c = X'*r;                     % correlations with the residual
    [csort, i_csort] = sort(abs(c));
    % First stage: add the k2 columns most correlated with the residual
    % to the current support.
    I = union(I, i_csort(end-k2+1:end));
    % Solve the least-squares problem restricted to the support.
    xt = X(:,I) \ Y;
    [xtsort, i_xtsort] = sort(abs(xt));
    % Second stage: keep only the k1 largest coefficients.
    J = I(i_xtsort(end-k1+1:end));
    x1 = zeros(p,1);
    x1(J) = xt(i_xtsort(end-k1+1:end));
    I = J;
    if norm(Y - X*x1)/norm(Y) < tol
        break
    end
end

beta = x1;
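
The two-stage iteration above (correlate, enlarge support, least squares, re-threshold) can be sketched outside MATLAB as well. Below is a minimal NumPy re-implementation for illustration; the function name `recommended_tst` and the demo problem are my own and not part of the original code, and this sketch omits the `xinitial` argument for brevity.

```python
import numpy as np

def recommended_tst(X, y, nsweep=300, tol=1e-5, ro=None):
    """Sketch of the TST iteration (Maleki & Donoho tuning)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float).ravel()
    # Normalize so columns have (on average) unit l2 norm.
    colnorm = np.mean(np.sqrt((X ** 2).sum(axis=0)))
    X = X / colnorm
    y = y / colnorm
    n, p = X.shape
    delta = n / p
    if ro is None:
        # Optimally tuned rho as a function of delta = n/p.
        ro = 0.044417 * delta**2 + 0.34142 * delta + 0.14844
    k = int(np.floor(ro * n))          # k1 = k2 = k in the MATLAB code
    x = np.zeros(p)
    I = np.array([], dtype=int)
    for _ in range(nsweep):
        r = y - X @ x                  # residual
        c = X.T @ r                    # correlations with the residual
        # First stage: enlarge support with the k largest correlations.
        top = np.argsort(np.abs(c))[-k:]
        I = np.union1d(I, top).astype(int)
        # Least squares restricted to the current support.
        xt, *_ = np.linalg.lstsq(X[:, I], y, rcond=None)
        # Second stage: keep only the k largest coefficients.
        keep = np.argsort(np.abs(xt))[-k:]
        J = I[keep]
        x = np.zeros(p)
        x[J] = xt[keep]
        I = J
        if np.linalg.norm(y - X @ x) < tol * np.linalg.norm(y):
            break
    return x

# Demo: recover a 3-sparse vector from 50 noiseless Gaussian measurements.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 100)) / np.sqrt(50)
x0 = np.zeros(100)
x0[[3, 40, 77]] = [2.0, -1.5, 3.0]
y = X @ x0
xhat = recommended_tst(X, y)
```

In the noiseless demo the support is identified within a few sweeps and the least-squares step then reproduces the true coefficients, so `xhat` matches `x0` up to numerical precision.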