<html> <head>
<title>Bayes Net Toolbox for Matlab</title>
</head>

<body>
<!--<body bgcolor="#FFFFFF"> -->

<h1>Bayes Net Toolbox for Matlab</h1>
Written by Kevin Murphy, 1997--2002.
Last updated: 19 October 2007.

<P><P>
<table>
<tr>
<td>
<img align=left src="Figures/mathbymatlab.gif" alt="Matlab logo">
<!-- <img align=left src="toolbox.gif" alt="Toolbox logo">-->
<td>
<!--<center>-->
<a href="http://groups.yahoo.com/group/BayesNetToolbox/join">
<img src="http://groups.yahoo.com/img/ui/join.gif" border=0><br>
Click to subscribe to the BNT email list</a>
<br>
(<a href="http://groups.yahoo.com/group/BayesNetToolbox">
http://groups.yahoo.com/group/BayesNetToolbox</a>)
<!--</center>-->
</table>


<p>
<ul>
<li> <a href="changelog.html">Changelog</a>

<li> <a
href="http://www.cs.ubc.ca/~murphyk/Software/BNT/FullBNT-1.0.4.zip">Download
zip file</a>.

<li> <a href="install.html">Installation</a>

<li> <a href="license.gpl">Terms and conditions of use (GNU Library GPL)</a>

<li> <a href="usage.html">How to use the toolbox</a>

<li> <a href="whyNotSourceforge.html">Why I closed the SourceForge
site</a>.

<!--
<li> <a href="Talks/BNT_mathworks.ppt">Powerpoint slides on graphical models
and BNT</a>, presented to the Mathworks, June 2003

<li> <a href="Talks/gR03.ppt">Powerpoint slides on BNT and object
recognition</a>, presented at the <a
href="http://www.math.auc.dk/gr/gr2003.html">gR</a> workshop,
September 2003.
-->

<!--
<li> <a href="gR03.pdf">Proposed design for gR, a graphical models
toolkit in R</a>, September 2003.
(For more information on the gR project,
click <a href="http://www.r-project.org/gR/">here</a>.)
-->

<li>
<!--
<img src = "../new.gif" alt="new">
-->
<a href="../../Papers/bnt.pdf">Invited paper on BNT</a>,
published in
Computing Science and Statistics, 2001.

<li> <a href="../bnsoft.html">Other Bayes net software</a>

<!--<li> <a href="software.html">Other Matlab software</a>-->

<li> <a href="../../Bayes/bnintro.html">A brief introduction to
Bayesian Networks</a>

<li> <a href="#features">Major features</a>
<li> <a href="#models">Supported models</a>
<!--<li> <a href="#future">Future work</a>-->
<li> <a href="#give_away">Why do I give the code away?</a>
<li> <a href="#why_matlab">Why Matlab?</a>
<li> <a href="#ack">Acknowledgments</a>
</ul>
<p>

<h2><a name="features">Major features</a></h2>
<ul>

<li> BNT supports many types of
<b>conditional probability distributions</b> (nodes),
and it is easy to add more (a small construction example follows this list).
<ul>
<li>Tabular (multinomial)
<li>Gaussian
<li>Softmax (logistic/sigmoid)
<li>Multi-layer perceptron (neural network)
<li>Noisy-or
<li>Deterministic
</ul>
<p>
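For example, here is how the classic water-sprinkler network from
<a href="usage.html">the usage guide</a> is assembled from tabular CPDs
(a minimal sketch; see that page for the full walkthrough):
<pre>
% Cloudy -> Sprinkler, Cloudy -> Rain, Sprinkler -> WetGrass, Rain -> WetGrass
N = 4;
C = 1; S = 2; R = 3; W = 4;
dag = zeros(N,N);
dag(C,[R S]) = 1;
dag(R,W) = 1;
dag(S,W) = 1;
node_sizes = 2*ones(1,N);   % every node is binary
bnet = mk_bnet(dag, node_sizes, 'discrete', 1:N);
% CPTs are given as vectors in which the first parent toggles fastest
bnet.CPD{C} = tabular_CPD(bnet, C, [0.5 0.5]);
bnet.CPD{R} = tabular_CPD(bnet, R, [0.8 0.2 0.2 0.8]);
bnet.CPD{S} = tabular_CPD(bnet, S, [0.5 0.9 0.5 0.1]);
bnet.CPD{W} = tabular_CPD(bnet, W, [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);
</pre>
<p>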

<li> BNT supports <b>decision and utility nodes</b> in addition to chance
nodes,
i.e., influence diagrams as well as Bayes nets.
<p>

<li> BNT supports static and dynamic BNs (useful for modelling dynamical systems
and sequence data).
<p>
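A dynamic example: an HMM is just a two-slice DBN with one hidden and one
observed node per slice. A minimal sketch (the state and symbol counts here
are arbitrary):
<pre>
% Within-slice and between-slice topology of a 2-node-per-slice DBN
intra = zeros(2); intra(1,2) = 1;     % hidden(t) -> obs(t)
inter = zeros(2); inter(1,1) = 1;     % hidden(t) -> hidden(t+1)
Q = 3;                                % number of hidden states (arbitrary)
O = 5;                                % number of observation symbols (arbitrary)
bnet = mk_dbn(intra, inter, [Q O], 'discrete', 1:2, 'observed', 2);
bnet.CPD{1} = tabular_CPD(bnet, 1);   % initial state prior (random init)
bnet.CPD{2} = tabular_CPD(bnet, 2);   % observation matrix (random init)
bnet.CPD{3} = tabular_CPD(bnet, 3);   % transition matrix (random init)
</pre>
<p>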

<li> BNT supports many different <b>inference algorithms</b>,
and it is easy to add more (see the sketch after these lists).

<ul>
<li> Exact inference for static BNs:
<ul>
<li>junction tree
<li>variable elimination
<li>brute force enumeration (for discrete nets)
<li>linear algebra (for Gaussian nets)
<li>Pearl's algorithm (for polytrees)
<li>quickscore (for QMR)
</ul>

<p>
<li> Approximate inference for static BNs:
<ul>
<li>likelihood weighting
<li>Gibbs sampling
<li>loopy belief propagation
</ul>

<p>
<li> Exact inference for DBNs:
<ul>
<li>junction tree
<li>frontier algorithm
<li>forwards-backwards (for HMMs)
<li>Kalman-RTS (for LDSs)
</ul>

<p>
<li> Approximate inference for DBNs:
<ul>
<li>Boyen-Koller
<li>factored-frontier/loopy belief propagation
</ul>

</ul>
<p>
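All inference engines share one interface, so they are interchangeable. For
example, exact inference on the sprinkler network above with the junction
tree engine (a sketch following the usage guide):
<pre>
engine = jtree_inf_engine(bnet);     % swap in, e.g., var_elim_inf_engine here
evidence = cell(1,N);
evidence{W} = 2;                     % observe WetGrass = true
[engine, loglik] = enter_evidence(engine, evidence);
marg = marginal_nodes(engine, S);
marg.T                               % P(Sprinkler | WetGrass = true)
</pre>
<p>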

<li>
BNT supports several methods for <b>parameter learning</b>,
and it is easy to add more (see the sketch below).
<ul>

<li> Batch MLE/MAP parameter learning using EM.
(Each node type has its own M method, e.g., softmax nodes use IRLS,<br>
and each inference engine has its own E method, so the code is fully modular.)

<li> Sequential/batch Bayesian parameter learning (for fully observed tabular nodes only).
</ul>

<p>
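A minimal EM sketch: generate some cases from the sprinkler network, hide
one node, and re-estimate the parameters (the choice of hidden node is
arbitrary):
<pre>
nsamples = 100;
samples = cell(N, nsamples);         % samples{i,m} = value of node i in case m
for m = 1:nsamples
  samples(:,m) = sample_bnet(bnet);
end
samples(C,:) = cell(1, nsamples);    % hide Cloudy, so EM is actually needed
engine = jtree_inf_engine(bnet);
[bnet2, LLtrace] = learn_params_em(engine, samples);
</pre>
<p>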
<li>
BNT supports several methods for <b>regularization</b>,
and it is easy to add more (see the sketch below).
<ul>
<li> Any node can have its parameters clamped (made non-adjustable).
<li> Any set of compatible nodes can have their parameters tied (cf.
weight sharing in a neural net).
<li> Some node types (e.g., tabular) support priors for MAP estimation.
<li> Gaussian covariance matrices can be declared full or diagonal, and can
be tied across states of their discrete parents (if any).
</ul>

<p>
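A sketch of how these options are expressed (assuming the 'equiv_class',
'prior_type' and 'cov_type' option names described in the parameter
documentation):
<pre>
% Two discrete parents, each with a scalar Gaussian child
dag = zeros(4); dag(1,3) = 1; dag(2,4) = 1;
ns = [2 2 1 1];                       % node sizes; Gaussian nodes are scalar
eclass = [1 1 2 2];                   % tie node 2 to 1 and node 4 to 3
bnet = mk_bnet(dag, ns, 'discrete', [1 2], 'equiv_class', eclass);
% One CPD per equivalence class, created from a representative node
bnet.CPD{1} = tabular_CPD(bnet, 1, 'prior_type', 'dirichlet');  % MAP prior
bnet.CPD{2} = gaussian_CPD(bnet, 3, 'cov_type', 'diag');        % diagonal cov
</pre>
<p>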
<li>
BNT supports several methods for <b>structure learning</b>,
and it is easy to add more (see the sketch below).
<ul>

<li> Bayesian structure learning,
using MCMC or local search (for fully observed tabular nodes only).

<li> Constraint-based structure learning (IC/PC and IC*/FCI).
</ul>

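For instance, K2 (a local-search method) can recover a DAG from fully
observed data given a node ordering; a sketch on the sprinkler network:
<pre>
ncases = 500;
data = zeros(N, ncases);             % data(i,m) = value of node i in case m
for m = 1:ncases
  data(:,m) = cell2num(sample_bnet(bnet));
end
order = [C S R W];                   % assumed known topological ordering
dag_hat = learn_struct_K2(data, node_sizes, order);
</pre>
<p>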
<p>
<li> The source code is extensively documented, object-oriented, and free, making it
an excellent tool for teaching, research and rapid prototyping.

</ul>


<h2><a name="models">Supported probabilistic models</a></h2>
<p>
It is trivial to implement all of
the following probabilistic models using the toolbox
(a small example follows the list).
<ul>
<li>Static
<ul>
<li> Linear regression, logistic regression, hierarchical mixtures of experts

<li> Naive Bayes classifiers, mixtures of Gaussians,
sigmoid belief nets

<li> Factor analysis, probabilistic
PCA, probabilistic ICA, mixtures of these models

</ul>

<li>Dynamic
<ul>

<li> HMMs, factorial HMMs, coupled HMMs, input-output HMMs, DBNs

<li> Kalman filters, ARMAX models, switching Kalman filters,
tree-structured Kalman filters, multiscale AR models

</ul>

<li> Many other combinations, for which there are (as yet) no names!

</ul>
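<p>
For example, a mixture of Gaussians is just a two-node network, a discrete
mixture label pointing at a Gaussian observation (a sketch; the component
count and dimensionality are arbitrary):
<pre>
dag = zeros(2); dag(1,2) = 1;         % label -> observation
ns = [3 2];                           % 3 mixture components, 2-D observations
bnet = mk_bnet(dag, ns, 'discrete', 1, 'observed', 2);
bnet.CPD{1} = tabular_CPD(bnet, 1);   % mixing weights (random init)
bnet.CPD{2} = gaussian_CPD(bnet, 2);  % per-component means and covariances
</pre>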


<!--
<h2><a name="future">Future work</h2>

I have a long <a href="wish.txt">wish list</a>
of features I would like to add to BNT
at some point in the future.
Please email me (<a
href="mailto:murphyk@cs.berkeley.edu">murphyk@cs.berkeley.edu</a>)
if you are interested in contributing!
-->


<h2><a name="give_away">Why do I give the code away?</a></h2>

<ul>

<li>
I was hoping for a Linux-style effect, whereby people would contribute
their own Matlab code so that the package would grow. With a few
exceptions, this has not happened,
although several people have provided bug-fixes (see the <a
href="#ack">acknowledgments</a>).
Perhaps the <a
href="http://www.cs.berkeley.edu/~murphyk/OpenBayes/index.html">Open
Bayes Project</a> will be more
successful in this regard, although the evidence to date is not promising.

<p>
<li>
Knowing that someone else might read your code forces one to
document it properly, a good practice in any case, as anyone who
has revisited old code knows.
In addition, with many "eyeballs" on the code, it is easier to spot bugs.

<p>
<li>
I believe in the concept of
<a href="http://www-stat.stanford.edu/~donoho/Reports/1995/wavelab.pdf">
reproducible research</a>.
Good science requires that other people be able
to replicate your experiments.
Often a paper does not give enough details about how exactly an
algorithm was implemented (e.g., how were the parameters chosen? what
initial conditions were used?), and these details can make a big difference in
practice.
Hence one should release the code that
was actually used to generate the results in one's paper.
This also prevents re-inventing the wheel.

<p>
<li>
I was fed up with reading papers where all people do is figure out how
to do exact inference and/or learning
in a model which is just a trivial special case of a general Bayes net, e.g.,
input-output HMMs, coupled HMMs, auto-regressive HMMs.
My hope is that, by releasing general purpose software, the field can
move on to more interesting questions.
As Alfred North Whitehead said in 1911,
"Civilization advances by extending the number of important operations
that we can do without thinking about them."

</ul>




<h2><a name="why_matlab">Why Matlab?</a></h2>

Matlab is an interactive, matrix-oriented programming language that
enables one to express one's (mathematical) ideas very concisely and directly,
without having to worry about annoying details like memory allocation
or type checking. This considerably reduces development time and
keeps code short, readable and fully portable.
Matlab has excellent built-in support for many data analysis and
visualization routines. In addition, there are many useful toolboxes, e.g., for
neural networks, signal and image processing.
The main disadvantages of Matlab are that it can be slow (which is why
we are currently rewriting parts of BNT in C), and that the commercial
license is expensive (although the student version is only $100 in the US).
<p>
Many people ask me why I did not use
<a href="http://www.octave.org/">Octave</a>,
an open-source Matlab clone.
The reason is that
Octave does not support multi-dimensional arrays,
cell arrays, objects, etc.
<p>
Click <a href="../which_language.html">here</a> for a more detailed
comparison of Matlab and other languages.
<h2><a name="ack">Acknowledgments</a></h2>

I would like to thank numerous people for bug fixes, including:
Rainer Deventer, Michael Robert James, Philippe Leray, Pedrito Maynard-Reid II, Andrew Ng,
Ron Parr, Ilya Shpitser, Xuejing Sun, Ursula Sondhauss.
<p>
I would like to thank the following people for contributing code:
Pierpaolo Brutti, Ali Taylan Cemgil, Tamar Kushnir,
Tom Murray,
Nicholas Saunier,
Ken Shan,
Yair Weiss,
Bob Welch,
Ron Zohar.
<p>
The following Intel employees have also contributed code:
Qian Diao, Shan Huang, Yimin Zhang and especially Wei Hu.

<p>
I would like to thank Stuart Russell for funding me over the years as
I developed BNT, and Gary Bradski for hiring me as an intern at Intel,
which has supported much of the recent development of BNT.


</body>
</html>