diff DL/Majorization Minimization DL/mm1.m @ 155:b14209313ba4 ivand_dev
Integration of Majorization Minimisation Dictionary Learning
| author | Ivan Damnjanovic lnx <ivan.damnjanovic@eecs.qmul.ac.uk> |
|---|---|
| date | Mon, 22 Aug 2011 11:46:35 +0100 |
| parents | |
| children | |
```diff
--- /dev/null	Thu Jan 01 00:00:00 1970 +0000
+++ b/DL/Majorization Minimization DL/mm1.m	Mon Aug 22 11:46:35 2011 +0100
@@ -0,0 +1,50 @@
+function [unhat,er] = mm1(Phi,x,u0,to,lambda,maxIT,eps,map)
+%% Iterative Soft Thresholding (with optional debiasing)
+%
+% Phi    = Normalized dictionary
+% x      = Signal(s); can be a vector or a matrix
+% u0     = Initial guess for the coefficients
+% to     = 1/(step size); must be larger than the spectral norm of the dictionary Phi
+% lambda = Lagrange multiplier (regulates shrinkage)
+% eps    = Stopping criterion for iterative soft thresholding and the MM dictionary update
+% map    = Debiasing: 0 = no, 1 = yes
+% unhat  = Updated coefficients
+% er     = Objective cost
+%%
+cont = 1;
+in = 1;
+% un = zeros(size(u0,1),size(u0,2));
+un = u0;
+c1 = (1/to^2)*Phi'*x;
+c2 = (1/to^2)*(Phi'*Phi);
+%%%%
+while (cont && (in<=maxIT))
+    unold = un;
+    %%%%%% Soft thresholding %%%%%%%
+    alphap = (un + c1 - c2*un);
+    un = (alphap-(lambda/(2*to^2))*sign(alphap)).*(abs(alphap)>=(lambda/(2*to^2)));
+    in = in+1;
+    cont = sum(sum((unold-un).^2))>eps;
+end
+%%%%%%%%%%
+if map == 1,
+    %% Mapping onto the selected support %%%%
+    [uN,uM] = size(un);
+    unhat = zeros(uN,uM);
+    for l = 1:uM,
+        unz = (abs(un(:,l))>0);
+        M = diag(unz);
+        PhiNew = Phi*M;
+        PhiS = PhiNew(:,unz);
+        unt = inv(PhiS'*PhiS+.0001*eye(sum(unz)))*PhiS'*x(:,l);
+        unhat(unz,l) = unt;
+    end
+else
+    unhat = un;
+end
+%%% Cost function calculation
+if map == 1,
+    er = sum(sum((Phi*unhat-x).^2))+lambda*(sum(sum(abs(unhat)>0))); %% l_0 cost function
+else
+    er = sum(sum((Phi*unhat-x).^2))+lambda*(sum(sum(abs(unhat))));
+end
```
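For readers following the algorithm rather than the MATLAB syntax, the two steps of `mm1.m` — the iterative soft-thresholding update `un + c1 - c2*un` followed by shrinkage, and the least-squares debiasing on the recovered support — can be sketched in Python/NumPy. This is an illustrative reimplementation, not code from the repository; the function name `ista_debias` and its keyword arguments are hypothetical, and it keeps the same `1e-4` Tikhonov regularization that `mm1.m` uses in its `inv(PhiS'*PhiS+.0001*eye(...))` step.

```python
import numpy as np

def ista_debias(Phi, x, u0, to, lam, max_it=100, tol=1e-6, debias=True):
    """Sketch of iterative soft thresholding with optional debiasing.

    Arguments mirror mm1.m: `to` is 1/(step size) and should exceed the
    spectral norm of Phi; `lam` regulates the shrinkage; `tol` plays the
    role of eps; `debias` plays the role of map.
    """
    un = u0.copy()
    c1 = (Phi.T @ x) / to**2          # (1/to^2) * Phi' * x
    c2 = (Phi.T @ Phi) / to**2        # (1/to^2) * Phi' * Phi
    thr = lam / (2 * to**2)           # soft-threshold level lambda/(2*to^2)
    for _ in range(max_it):
        alphap = un + c1 - c2 @ un            # gradient step
        un_new = np.sign(alphap) * np.maximum(np.abs(alphap) - thr, 0.0)
        if np.sum((un_new - un) ** 2) <= tol:  # same stopping rule as mm1.m
            un = un_new
            break
        un = un_new
    if not debias:
        return un
    # Debiasing: regularized least squares restricted to each column's support.
    unhat = np.zeros_like(un)
    for l in range(un.shape[1]):
        s = np.abs(un[:, l]) > 0               # support of column l
        if s.any():
            PhiS = Phi[:, s]
            G = PhiS.T @ PhiS + 1e-4 * np.eye(s.sum())
            unhat[s, l] = np.linalg.solve(G, PhiS.T @ x[:, l])
    return unhat
```

One deliberate departure from the MATLAB: the normal equations are solved with `np.linalg.solve` instead of forming an explicit inverse, which is numerically preferable and equivalent for this small regularized system. The objective cost `er` is omitted for brevity.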