toolboxes/FullBNT-1.0.7/bnt/examples/dynamic/reveal1.m @ 0:e9a9cd732c1e (tip)
first hg version after svn

author: wolffd
date:   Tue, 10 Feb 2015 15:05:51 +0000

% Make a DBN with the following inter-connectivity matrix
%    1
%   / \
%  2   3
%   \ /
%    4
%    |
%    5
% where all arcs point down. In addition, there are persistence arcs from each node to itself.
% There are no intra-slice connections.
% Nodes have noisy-or CPDs.
% Node 1 turns on spontaneously due to its leaky source.
% This effect trickles down to the other nodes in the order shown.
% All the other nodes inhibit their leaks.
% None of the nodes inhibit the connection from themselves, so that once they are on, they remain
% on (persistence).
%
% This model was used in the experiments reported in
% - "Learning the structure of DBNs", Friedman, Murphy and Russell, UAI 1998.
% where the structure was learned even in the presence of missing data.
% In that paper, we used the structural EM algorithm.
% Here, we assume full observability and tabular CPDs for the learner, so we can use a much
% simpler learning algorithm.

ss = 5;

inter = eye(ss);
inter(1,[2 3]) = 1;
inter(2,4)=1;
inter(3,4)=1;
inter(4,5)=1;

intra = zeros(ss);
ns = 2*ones(1,ss);

bnet = mk_dbn(intra, inter, ns);

% All nodes start out off
for i=1:ss
  bnet.CPD{i} = tabular_CPD(bnet, i, [1.0 0.0]');
end
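% (In BNT's binary encoding, state 1 = off and state 2 = on, so the column
% vector [1.0 0.0]' above gives each slice-1 node probability 1 of being off.)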

% The following params correspond to Fig 4a in the UAI 98 paper
% The first arg is the leak inhibition prob.
% The vector contains the inhib probs from the parents in the previous slice;
% the last element is self, which is never inhibited.
bnet.CPD{1+ss} = noisyor_CPD(bnet, 1+ss, 0.8, 0);
bnet.CPD{2+ss} = noisyor_CPD(bnet, 2+ss, 1, [0.9 0]);
bnet.CPD{3+ss} = noisyor_CPD(bnet, 3+ss, 1, [0.8 0]);
bnet.CPD{4+ss} = noisyor_CPD(bnet, 4+ss, 1, [0.7 0.6 0]);
bnet.CPD{5+ss} = noisyor_CPD(bnet, 5+ss, 1, [0.5 0]);
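
% Sketch, not part of the original experiment: under the noisy-or model, e.g.
% for node 4, if parents 2 and 3 were on in the previous slice and node 4 itself
% was off, then P(on) = 1 - 0.7*0.6 = 0.58 (its leak is fully inhibited here).
% Assuming BNT's CPD_to_CPT method is available for noisyor_CPD objects, the
% full 2x2x2x2 table can be inspected directly:
cpt4 = CPD_to_CPT(bnet.CPD{4+ss});  % dims: parent 2, parent 3, self (prev slice), node 4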


% Generate some training data

nseqs = 20;
seqs = cell(1,nseqs);
T = 30;
for i=1:nseqs
  seqs{i} = sample_dbn(bnet, T);
end
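
% Sketch (assumes cell2num from BNT's KPMtools is on the path): each sequence
% is an ss-by-T cell array of scalars, so it can be flattened into a numeric
% matrix for a quick look; entries are 1 (off) or 2 (on).
first_seq = cell2num(seqs{1});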

max_fan_in = 3; % let's cheat a little here

% computing num. incorrect edges as a fn of the size of the training set
%sz = [5 10 15 20];
sz = [5 10];
h = zeros(1, length(sz));
for i=1:length(sz)
  inter2 = learn_struct_dbn_reveal(seqs(1:sz(i)), ns, max_fan_in);
  h(i) = sum(abs(inter(:)-inter2(:))); % hamming distance
end
h
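
% Sketch (plain MATLAB, not part of the original script): visualize how the
% number of incorrect edges drops as the training set grows.
plot(sz, h, 'o-');
xlabel('number of training sequences');
ylabel('Hamming distance to true inter matrix');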