function c = mirclassify(a,da,t,dt,varargin)
%   c = mirclassify(test,features_test,train,features_train) classifies the
%       audio sequence(s) contained in the audio object test, based on the
%       analytic feature(s) features_test, following the supervised
%       learning of a training set defined by the audio object train and
%       the corresponding analytic feature(s) features_train.
%           * The analytic feature(s) features_test should *not* be
%               frame-decomposed. Frame-decomposed data should first be
%               summarized, using for instance mirmean or mirstd.
%           * Multiple analytic features must be grouped into one cell
%               array.
%       You can also integrate your own arrays of numbers, computed outside
%           MIRtoolbox, as part of the features. These arrays should be
%           given as matrices where each successive column is the analysis
%           of each successive file.
%       Example:
%           mirclassify(test, mfcc(test), train, mfcc(train))
%           mirclassify(test, {mfcc(test), centroid(test)}, ...
%                       train, {mfcc(train), centroid(train)})
%       Optional arguments:
%           mirclassify(...,'Nearest') uses the minimum-distance strategy
%               (default).
%           mirclassify(...,'Nearest',k) uses the k-nearest-neighbour
%               strategy.
%               Default value: k = 1, corresponding to the minimum-distance
%                   strategy.
%           mirclassify(...,'GMM',ng) uses a Gaussian mixture model. Each
%               class is modeled by at most ng Gaussians.
%               Default value: ng = 1.
%               Additionally, the type of mixture model can be specified,
%               using the set of values proposed in the gmm function, i.e.
%               'spherical', 'diag', 'full' (default value) and 'ppca'
%               (cf. help gmm).
%               Requires the Netlab toolbox.
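%       A further usage sketch of the optional arguments (hypothetical
%           calls, assuming as in the examples above that test and train
%           are audio objects carrying class labels, which are read below
%           via get(...,'Label')):
%           mirclassify(test, mfcc(test), train, mfcc(train), 'Nearest', 5)
%           mirclassify(test, mfcc(test), train, mfcc(train), 'GMM', 3, 'diag')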

lab = get(t,'Label');
c.labtraining = lab;
rlab = get(a,'Label');
c.labtest = rlab;
[k,ncentres,covartype,kmiter,emiter,d,norml,mahl] = scanargin(varargin);
disp('Classifying...')
if not(iscell(dt))
    dt = {dt};
end

% Build the training observation matrix vt (one column per training file).
lvt = length(get(t,'Data'));
vt = [];
for i = 1:length(dt)
    if isnumeric(dt{i})
        d = cell(1,size(dt{i},2));
        for j = 1:size(dt{i},2)
            d{j} = dt{i}(:,j);
        end
    else
        d = get(dt{i},'Data');
    end
    vt = integrate(vt,d,lvt,norml);
    if isa(dt{i},'scalar')
        m = mode(dt{i});
        if not(isempty(m))
            vt = integrate(vt,m,lvt,norml);
        end
    end
end
c.training = vt;
dim = size(vt,1);

% Build the test observation matrix va (one column per test file).
if not(iscell(da))
    da = {da};
end
lva = length(get(a,'Data'));
va = [];
for i = 1:length(da)
    if isnumeric(da{i})
        d = cell(1,size(da{i},2));
        for j = 1:size(da{i},2)
            d{j} = da{i}(:,j);
        end
    else
        d = get(da{i},'Data');
    end
    va = integrate(va,d,lva,norml);
    if isa(da{i},'scalar')
        m = mode(da{i});
        if not(isempty(m))
            va = integrate(va,m,lva,norml);
        end
    end
end
c.test = va;
c.nbobs = lvt;
totva = [vt va];
mahl = cov(totva'); % Covariance of the pooled features, used as Mahalanobis metric.

if k % k-Nearest Neighbour
    c.nbparam = lvt;
    for l = 1:lva
        % Sort the training observations by increasing distance to the
        % current test observation, then take a majority vote among the
        % k nearest ones.
        [sv,idx] = sort(distance(va(:,l),vt,d,mahl));
        labs = cell(0); % Class labels
        founds = [];    % Number of found elements in each class
        for i = idx(1:k)
            labi = lab{i};
            found = 0;
            for j = 1:length(labs)
                if isequal(labi,labs{j})
                    found = j;
                end
            end
            if found
                founds(found) = founds(found)+1;
            else
                labs{end+1} = labi;
                founds(end+1) = 1;
            end
        end
        [b,ib] = max(founds);
        c.classes{l} = labs{ib};
    end
elseif ncentres % Gaussian Mixture Model
    % Group the indices of the training observations by class label.
    labs = cell(0);   % Class labels
    founds = cell(0); % Elements associated to each label.
    for i = 1:lvt
        labi = lab{i};
        found = 0;
        for j = 1:length(labs)
            if isequal(labi,labs{j})
                founds{j}(end+1) = i;
                found = 1;
            end
        end
        if not(found)
            labs{end+1} = labi;
            founds{end+1} = i;
        end
    end
    % Netlab optimisation options (tolerances and iteration counts) for the
    % k-means initialisation and the EM training of each class model.
    options = zeros(1,18);
    options(2:3) = 1e-4;
    options(4) = 1e-6;
    options(16) = 1e-8;
    options(17) = 0.1;
    options(1) = 0; % Display warnings only (1 prints error values, -1 is silent).
    c.nbparam = 0;
    OK = 0;
    while not(OK)
        OK = 1;
        for i = 1:length(labs)
            options(14) = kmiter;
            try
                mix{i} = gmm(dim,ncentres,covartype);
            catch
                error('ERROR IN CLASSIFY: Netlab toolbox not installed.');
            end
            mix{i} = netlabgmminit(mix{i},vt(:,founds{i})',options);
            options(5) = 1;
            options(14) = emiter;
            try
                mix{i} = gmmem(mix{i},vt(:,founds{i})',options);
                c.nbparam = c.nbparam + ...
                    length(mix{i}.centres(:)) + length(mix{i}.covars(:));
            catch
                err = lasterr;
                warning('WARNING IN CLASSIFY: Problem when calling GMMEM:');
                disp(err);
                disp('Let us try again...');
                OK = 0;
            end
        end
    end
    % Assign each test observation to the class whose prior-weighted GMM
    % likelihood is highest.
    pr = zeros(lva,length(labs));
    for i = 1:length(labs)
        prior = length(founds{i})/lvt;
        pr(:,i) = prior * gmmprob(mix{i},va');
        %c.post{i} = gmmpost(mix{i},va');
    end
    [mm,ib] = max(pr');
    for i = 1:lva
        c.classes{i} = labs{ib(i)};
    end
end

% Proportion of correctly classified test files, if test labels are known.
if isempty(rlab)
    c.correct = NaN;
else
    correct = 0;
    for i = 1:lva
        if isequal(c.classes{i},rlab{i})
            correct = correct + 1;
        end
    end
    c.correct = correct / lva;
end
c = class(c,'mirclassify');


function vt = integrate(vt,v,lvt,norml)
% Appends the feature values in v (one cell per file) as new rows of the
% observation matrix vt, optionally standardising each feature across files.
vtl = [];
for l = 1:lvt
    vl = v{l};
    if iscell(vl)
        vl = vl{1};
    end
    if iscell(vl)
        vl = vl{1};
    end
    if size(vl,2) > 1
        mirerror('MIRCLASSIFY','The analytic features guiding the classification should not be frame-decomposed.');
    end
    vtl(:,l) = vl;
end
if norml
    dnom = repmat(std(vtl,0,2),[1 size(vtl,2)]);
    dnom = dnom + (dnom == 0); % In order to avoid division by 0
    vtl = (vtl - repmat(mean(vtl,2),[1 size(vtl,2)])) ./ dnom;
end
vt(end+1:end+size(vtl,1),:) = vtl;


function [k,ncentres,covartype,kmiter,emiter,d,norml,mahl] = scanargin(v)
% Parses the optional arguments ('Nearest',k or 'GMM',ng,covartype) and
% returns the default settings otherwise.
k = 1;
d = 0;
i = 1;
ncentres = 0;
covartype = 'full';
kmiter = 10;
emiter = 100;
norml = 1;
mahl = 1;
while i <= length(v)
    arg = v{i};
    if ischar(arg) && strcmpi(arg,'Nearest')
        k = 1;
        if length(v)>i && isnumeric(v{i+1})
            i = i+1;
            k = v{i};
        end
    elseif ischar(arg) && strcmpi(arg,'GMM')
        k = 0;
        ncentres = 1;
        if length(v)>i
            if isnumeric(v{i+1})
                i = i+1;
                ncentres = v{i};
                if length(v)>i && ischar(v{i+1})
                    i = i+1;
                    covartype = v{i};
                end
            elseif ischar(v{i+1})
                i = i+1;
                covartype = v{i};
                if length(v)>i && isnumeric(v{i+1})
                    i = i+1;
                    ncentres = v{i};
                end
            end
        end
    elseif isnumeric(arg)
        k = v{i};
    else
        error('ERROR IN MIRCLASSIFY: Syntax error. See help mirclassify.');
    end
    i = i+1;
end


function y = distance(a,t,d,mahl)
% Mahalanobis distance between the column vector a and each column of the
% matrix t, using the inverse (or pseudo-inverse) of the covariance matrix mahl.
for i = 1:size(t,2)
    if det(mahl) > 0 % more generally, cond could be used
        lham = inv(mahl);
    else
        lham = pinv(mahl);
    end
    y(i) = sqrt((a - t(:,i))'*lham*(a - t(:,i)));
end
%y = sqrt(sum(repmat(a,[1,size(t,2)])-t,1).^2);
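
% A minimal standalone sketch of the metric computed in distance() above,
% assuming x and y are feature column vectors and C is a covariance matrix
% estimated from the pooled feature columns (names here are illustrative only):
%     iC  = pinv(C);                       % (pseudo-)inverse of the covariance
%     dxy = sqrt((x - y)' * iC * (x - y)); % Mahalanobis distance between x and y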