annotate: DL/Majorization Minimization DL/ExactDicoRecovery/ksvd_cn.m @ 160:e3035d45d014 danieleb
summary: Added support classes
author:  Daniele Barchiesi <daniele.barchiesi@eecs.qmul.ac.uk>
date:    Wed, 31 Aug 2011 10:53:10 +0100
parents: b14209313ba4
% K-SVD algorithm for Dictionary Learning
% Y      = input data (M x L matrix)
% Phi    = initial dictionary (M x N), e.g. random dictionary or first N data samples
% lambda = regularisation coefficient in (||Phi*X-Y||_F)^2 + lambda*||X||_1
% IT     = number of iterations
function [Phiout,X,ert] = ksvd_cn(Y,Phi,lambda,IT)
maxIT = 1000;                  % maximum number of soft-thresholding iterations
[PhiN,PhiM] = size(Phi);       % PhiN = signal dimension M, PhiM = number of atoms N
RR1 = PhiM;                    % number of atoms (kept from original code, unused below)
[PhiN,L] = size(Y);            % L = number of training samples
X = ones(PhiM,L);              % initial coefficient matrix (N x L)
ert = zeros(1,IT);             % l1-regularised approximation error per iteration
for it = 1:IT
    to = .1 + svds(Phi,1);     % majorisation/step constant: largest singular value of Phi plus a margin
    [PhiN,PhiM] = size(Phi);   % refresh dictionary dimensions
    eps = 3e-4;                % convergence tolerance for the sparse approximation (shadows built-in eps)
    map = 1;                   % project on the selected support (0 = no, 1 = yes)
    % Sparse approximation with iterative soft-thresholding
    [X,l1err] = mm1(Phi,Y,X,to,lambda,maxIT,eps,map);
    ert(it) = l1err;
    % Dictionary update: K-SVD with column-normalised atoms
    [Phi,X] = dict_update_KSVD_cn(Phi,Y,X);
end
Phiout = Phi;
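
A minimal usage sketch follows, assuming mm1.m and dict_update_KSVD_cn.m from the same ExactDicoRecovery directory are on the MATLAB path. The synthetic data, the dimensions M, N, L and the lambda/IT values are illustrative assumptions, not settings taken from the repository.

% --- usage sketch (illustrative, not part of ksvd_cn.m) ---
M = 20; N = 40; L = 500;                          % signal size, atoms, training samples (assumed)
Phi0 = randn(M,N);                                % random initial dictionary
Phi0 = Phi0 ./ repmat(sqrt(sum(Phi0.^2)),M,1);    % normalise columns to unit l2 norm
Y = Phi0(:,randi(N,1,L)) + 0.01*randn(M,L);       % synthetic training data: single atoms plus noise
lambda = 0.1; IT = 30;                            % assumed regularisation weight and iteration count
[Phi,X,ert] = ksvd_cn(Y,Phi0,lambda,IT);          % learn the dictionary
plot(ert); xlabel('iteration'); ylabel('l1-regularised error');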