% Check sampling on a mixture of experts model
%
% X \
% |  |
% Q  |
% |  /
% Y
%
% where all arcs point down.
% We condition everything on X, so X is a root node. Q is a softmax, and Y is a linear Gaussian.
% Q is hidden, X and Y are observed.

X = 1;
Q = 2;
Y = 3;
dag = zeros(3,3);
dag(X,[Q Y]) = 1;
dag(Q,Y) = 1;
ns = [1 2 2];
dnodes = [2];
bnet = mk_bnet(dag, ns, dnodes);

x = 0.5;
bnet.CPD{1} = root_CPD(bnet, 1, x);
bnet.CPD{2} = softmax_CPD(bnet, 2);
bnet.CPD{3} = gaussian_CPD(bnet, 3);

data_case = sample_bnet(bnet, 'evidence', {0.8, [], []})
ll = log_lik_complete(bnet, data_case)

data_case = sample_bnet(bnet, 'evidence', {-11, [], []})
ll = log_lik_complete(bnet, data_case)
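
% A small extension (sketch): repeat the same check for several clamped values of X,
% reusing the 'evidence' interface and log_lik_complete call shown above.
% The particular x values here are arbitrary illustrations, not part of the original check.
xs = [-1 0 0.5 1 2];
lls = zeros(1, length(xs));
for i=1:length(xs)
  data_case = sample_bnet(bnet, 'evidence', {xs(i), [], []});
  lls(i) = log_lik_complete(bnet, data_case);
end
lls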