<html> <head>
<title>Bayes Net Toolbox for Matlab</title>
</head>

<body>
<!--<body bgcolor="#FFFFFF"> -->

<h1>Bayes Net Toolbox for Matlab</h1>
Written by Kevin Murphy.
<br>
<b>BNT is now available from <a href="http://bnt.sourceforge.net/">sourceforge</a>!</b>
<!--
Last updated on 9 June 2004 (<a href="changelog.html">Detailed
changelog</a>)
-->
<p>

<p>
<table>
<tr>
<td>
<img align=left src="Figures/mathbymatlab.gif" alt="Matlab logo">
<!-- <img align=left src="toolbox.gif" alt="Toolbox logo">-->
<td>
<!--<center>-->
<a href="http://groups.yahoo.com/group/BayesNetToolbox/join">
<img src="http://groups.yahoo.com/img/ui/join.gif" border=0><br>
Click to subscribe to the BNT email list</a>
<br>
(<a href="http://groups.yahoo.com/group/BayesNetToolbox">
http://groups.yahoo.com/group/BayesNetToolbox</a>)
<!--</center>-->
</table>


<p>
<ul>
<!--<li> <a href="bnt_download.html">Download toolbox</a>-->
<li> Download BNT from the <a href="http://bnt.sourceforge.net/">BNT sourceforge site</a>

<li> <a href="license.gpl">Terms and conditions of use (GNU Library GPL)</a>

<li> <a href="usage.html">How to use the toolbox</a>

<li> <a href="Talks/BNT_mathworks.ppt">Powerpoint slides on graphical models
and BNT</a>, presented to the Mathworks, June 2003

<!--
<li> <a href="Talks/gR03.ppt">Powerpoint slides on BNT and object
recognition</a>, presented at the <a
href="http://www.math.auc.dk/gr/gr2003.html">gR</a> workshop,
September 2003.
-->

<li> <a href="gR03.pdf">Proposed design for gR, a graphical models
toolkit in R</a>, September 2003.
<!--
(For more information on the gR project,
click <a href="http://www.r-project.org/gR/">here</a>.)
-->

<li>
<!--
<img src = "../new.gif" alt="new">
-->
<a href="../../Papers/bnt.pdf">Invited paper on BNT</a>,
published in
Computing Science and Statistics, 2001.

<li> <a href="bnsoft.html">Other Bayes net software</a>

<!--<li> <a href="software.html">Other Matlab software</a>-->

<li> <a href="../../Bayes/bnintro.html">A brief introduction to
Bayesian networks</a>

<li> <a href="#features">Major features</a>
<li> <a href="#models">Supported models</a>
<!--<li> <a href="#future">Future work</a>-->
<li> <a href="#give_away">Why do I give the code away?</a>
<li> <a href="#why_matlab">Why Matlab?</a>
<li> <a href="#ack">Acknowledgments</a>
</ul>
<p>



<h2><a name="features">Major features</a></h2>
<ul>

<li> BNT supports many types of
<b>conditional probability distributions</b> (nodes),
and it is easy to add more; a short sketch follows the list.
<ul>
<li>Tabular (multinomial)
<li>Gaussian
<li>Softmax (logistic/sigmoid)
<li>Multi-layer perceptron (neural network)
<li>Noisy-or
<li>Deterministic
</ul>
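<p>
For concreteness, here is a minimal sketch (in Matlab, using the
toolbox's <tt>mk_bnet</tt> and CPD constructors; the two-node model
itself is made up for illustration) of how node types are mixed in
one network:
<pre>
% Hypothetical 2-node net: discrete class C -> scalar Gaussian observation Y.
N = 2;
dag = zeros(N,N);
C = 1; Y = 2;
dag(C,Y) = 1;
ns = [2 1];                 % C has 2 states; Y is 1-dimensional
bnet = mk_bnet(dag, ns, 'discrete', C);
bnet.CPD{C} = tabular_CPD(bnet, C, 'CPT', [0.5 0.5]);
% one mean/variance per state of the discrete parent C:
bnet.CPD{Y} = gaussian_CPD(bnet, Y, 'mean', [0 1], 'cov', [1 1]);
</pre>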
<p>

<li> BNT supports <b>decision and utility nodes</b>, as well as chance
nodes, i.e., influence diagrams as well as Bayes nets.
<p>

<li> BNT supports static and dynamic BNs (useful for modelling dynamical systems
and sequence data).
<p>

<li> BNT supports many different <b>inference algorithms</b>,
and it is easy to add more; see the sketch after the lists below.

<ul>
<li> Exact inference for static BNs:
<ul>
<li>junction tree
<li>variable elimination
<li>brute force enumeration (for discrete nets)
<li>linear algebra (for Gaussian nets)
<li>Pearl's algorithm (for polytrees)
<li>quickscore (for QMR)
</ul>

<p>
<li> Approximate inference for static BNs:
<ul>
<li>likelihood weighting
<li>Gibbs sampling
<li>loopy belief propagation
</ul>

<p>
<li> Exact inference for DBNs:
<ul>
<li>junction tree
<li>frontier algorithm
<li>forwards-backwards (for HMMs)
<li>Kalman-RTS (for LDSs)
</ul>

<p>
<li> Approximate inference for DBNs:
<ul>
<li>Boyen-Koller
<li>factored-frontier/loopy belief propagation
</ul>

</ul>
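<p>
All engines share one interface, so swapping algorithms is a one-line
change. As a sketch (modelled on the water-sprinkler example in the
<a href="usage.html">usage documentation</a>; the CPT numbers are
illustrative):
<pre>
% Sprinkler net: Cloudy -> {Sprinkler, Rain} -> WetGrass.
N = 4;
dag = zeros(N,N);
C = 1; S = 2; R = 3; W = 4;
dag(C,[S R]) = 1; dag(S,W) = 1; dag(R,W) = 1;
bnet = mk_bnet(dag, 2*ones(1,N), 'discrete', 1:N);
bnet.CPD{C} = tabular_CPD(bnet, C, 'CPT', [0.5 0.5]);
bnet.CPD{S} = tabular_CPD(bnet, S, 'CPT', [0.5 0.9 0.5 0.1]);
bnet.CPD{R} = tabular_CPD(bnet, R, 'CPT', [0.8 0.2 0.2 0.8]);
bnet.CPD{W} = tabular_CPD(bnet, W, 'CPT', [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);

engine = jtree_inf_engine(bnet);   % e.g., junction tree
evidence = cell(1,N);
evidence{W} = 2;                   % observe WetGrass = true
[engine, loglik] = enter_evidence(engine, evidence);
marg = marginal_nodes(engine, S);  % posterior over Sprinkler
marg.T
</pre>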
<p>

<li>
BNT supports several methods for <b>parameter learning</b>,
and it is easy to add more.
<ul>

<li> Batch MLE/MAP parameter learning using EM.
(Each node type has its own M method, e.g., softmax nodes use IRLS,
and each inference engine has its own E method, so the code is fully
modular; see the sketch below.)

<li> Sequential/batch Bayesian parameter learning (for fully observed tabular nodes only).
</ul>
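<p>
A sketch of batch EM, continuing the sprinkler net above (hiding node
C is contrived, just to give EM something to do):
<pre>
% Generate cases with sample_bnet, then hide Cloudy in each one.
nsamples = 50;
samples = cell(N, nsamples);       % samples{i,l} = value of node i in case l
for l = 1:nsamples
  samples(:,l) = sample_bnet(bnet);
  samples{C,l} = [];               % [] marks a hidden value
end
engine = jtree_inf_engine(bnet);
max_iter = 10;
[bnet2, LLtrace] = learn_params_em(engine, samples, max_iter);
</pre>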


<p>
<li>
BNT supports several methods for <b>regularization</b>,
and it is easy to add more; a sketch follows the list.
<ul>
<li> Any node can have its parameters clamped (made non-adjustable).
<li> Any set of compatible nodes can have their parameters tied (cf.
weight sharing in a neural net).
<li> Some node types (e.g., tabular) support priors for MAP estimation.
<li> Gaussian covariance matrices can be declared full or diagonal, and can
be tied across states of their discrete parents (if any).
</ul>
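<p>
For instance, a sketch of parameter tying and covariance options (the
4-node chain is hypothetical; with <tt>'equiv_class'</tt> the
<tt>CPD</tt> cell array is indexed by equivalence class, not by node):
<pre>
% Chain 1 -> 2 -> 3 -> 4, where nodes 2 and 3 share one tabular CPD
% and node 4 is a scalar Gaussian.
dag = zeros(4,4); dag(1,2) = 1; dag(2,3) = 1; dag(3,4) = 1;
ns = [2 2 2 1];
eclass = [1 2 2 3];                % nodes 2 and 3 are tied
bnet = mk_bnet(dag, ns, 'discrete', 1:3, 'equiv_class', eclass);
bnet.CPD{1} = tabular_CPD(bnet, 1, 'prior_type', 'dirichlet'); % MAP
bnet.CPD{2} = tabular_CPD(bnet, 2);          % serves nodes 2 and 3
bnet.CPD{3} = gaussian_CPD(bnet, 4, 'cov_type', 'diag', 'tied_cov', 1);
</pre>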

<p>
<li>
BNT supports several methods for <b>structure learning</b>,
and it is easy to add more.
<ul>

<li> Bayesian structure learning,
using MCMC or local search (for fully observed tabular nodes only);
see the sketch below.

<li> Constraint-based structure learning (IC/PC and IC*/FCI).
</ul>
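<p>
A sketch of score-based structure learning from fully observed
discrete data (the data matrix is generated here purely for
illustration; check the usage documentation for the exact optional
argument names of <tt>learn_struct_mcmc</tt>):
<pre>
% data(i,m) = value of node i in training case m.
ncases = 100;
data = zeros(N, ncases);
for m = 1:ncases
  data(:,m) = cell2num(sample_bnet(bnet));
end
order = [C S R W];                     % K2 requires a node ordering
dag2 = learn_struct_K2(data, 2*ones(1,N), order);
% Or sample DAGs from the posterior with MCMC:
[dags, accept] = learn_struct_mcmc(data, 2*ones(1,N), 'nsamples', 100);
</pre>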


<p>
<li> The source code is extensively documented, object-oriented, and free, making it
an excellent tool for teaching, research and rapid prototyping.

</ul>



<h2><a name="models">Supported probabilistic models</a></h2>
<p>
It is trivial to implement all of
the following probabilistic models using the toolbox;
a sketch of the HMM case follows the list.
<ul>
<li>Static
<ul>
<li> Linear regression, logistic regression, hierarchical mixtures of experts

<li> Naive Bayes classifiers, mixtures of Gaussians,
sigmoid belief nets

<li> Factor analysis, probabilistic
PCA, probabilistic ICA, mixtures of these models

</ul>

<li>Dynamic
<ul>

<li> HMMs, factorial HMMs, coupled HMMs, input-output HMMs, DBNs

<li> Kalman filters, ARMAX models, switching Kalman filters,
tree-structured Kalman filters, multiscale AR models

</ul>

<li> Many other combinations, for which there are (as yet) no names!

</ul>
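<p>
As an example of this claim, here is a sketch of an HMM expressed as a
2-slice DBN (following the DBN examples in the usage documentation;
the state and alphabet sizes are arbitrary):
<pre>
% HMM as a DBN: hidden Q(t) emits Y(t); Q(t-1) -> Q(t) between slices.
intra = zeros(2,2); intra(1,2) = 1;    % within-slice arc: Q -> Y
inter = zeros(2,2); inter(1,1) = 1;    % between-slice arc: Q -> Q
Q = 3; O = 5;                          % 3 hidden states, 5 output symbols
bnet = mk_dbn(intra, inter, [Q O], 'discrete', [1 2], 'observed', 2);
bnet.CPD{1} = tabular_CPD(bnet, 1);    % prior P(Q(1))
bnet.CPD{2} = tabular_CPD(bnet, 2);    % emission P(Y|Q)
bnet.CPD{3} = tabular_CPD(bnet, 3);    % transition P(Q(t)|Q(t-1))
</pre>
Changing <tt>intra</tt>/<tt>inter</tt> yields factorial or coupled HMMs
with no new inference code.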


<!--
<h2><a name="future">Future work</a></h2>

I have a long <a href="wish.txt">wish list</a>
of features I would like to add to BNT
at some point in the future.
Please email me (<a
href="mailto:murphyk@cs.berkeley.edu">murphyk@cs.berkeley.edu</a>)
if you are interested in contributing!
-->



<h2><a name="give_away">Why do I give the code away?</a></h2>

<ul>

<li>
I was hoping for a Linux-style effect, whereby people would contribute
their own Matlab code so that the package would grow. With a few
exceptions, this has not happened,
although several people have provided bug fixes (see the <a
href="#ack">acknowledgments</a>).
Perhaps the <a
href="http://www.cs.berkeley.edu/~murphyk/OpenBayes/index.html">Open
Bayes Project</a> will be more
successful in this regard, although the evidence to date is not promising.

<p>
<li>
Knowing that someone else might read your code forces you to
document it properly, a good practice in any case, as anyone who
has revisited old code knows.
In addition, with many "eyeballs", it is easier to spot bugs.

<p>
<li>
I believe in the concept of
<a href="http://www-stat.stanford.edu/~donoho/Reports/1995/wavelab.pdf">
reproducible research</a>.
Good science requires that other people be able
to replicate your experiments.
Often a paper does not give enough details about how exactly an
algorithm was implemented (e.g., how were the parameters chosen? what
initial conditions were used?), and these details can make a big
difference in practice.
Hence one should release the code that
was actually used to generate the results in one's paper.
This also prevents re-inventing the wheel.

<p>
<li>
I was fed up with reading papers in which all people do is figure out how
to do exact inference and/or learning
in a model which is just a trivial special case of a general Bayes net, e.g.,
input-output HMMs, coupled HMMs, auto-regressive HMMs.
My hope is that, by releasing general-purpose software, the field can
move on to more interesting questions.
As Alfred North Whitehead said in 1911,
"Civilization advances by extending the number of important operations
that we can do without thinking about them."

</ul>




<h2><a name="why_matlab">Why Matlab?</a></h2>

Matlab is an interactive, matrix-oriented programming language that
enables one to express one's (mathematical) ideas very concisely and directly,
without having to worry about annoying details like memory allocation
or type checking. This considerably reduces development time and
keeps code short, readable and fully portable.
Matlab has excellent built-in support for many data analysis and
visualization routines. In addition, there are many useful toolboxes, e.g., for
neural networks, signal and image processing.
The main disadvantages of Matlab are that it can be slow (which is why
we are currently rewriting parts of BNT in C), and that the commercial
license is expensive (although the student version is only $100 in the US).
<p>
Many people ask me why I did not use
<a href="http://www.octave.org/">Octave</a>,
an open-source Matlab clone.
The reason is that
Octave does not support multi-dimensional arrays,
cell arrays, objects, etc.
<p>
Click <a href="../which_language.html">here</a> for a more detailed
comparison of Matlab and other languages.



<h2><a name="ack">Acknowledgments</a></h2>

I would like to thank numerous people for bug fixes, including
Rainer Deventer, Michael Robert James, Philippe Leray, Pedrito Maynard-Reid II, Andrew Ng,
Ron Parr, Ilya Shpitser, Xuejing Sun and Ursula Sondhauss.
<p>
I would like to thank the following people for contributing code:
Pierpaolo Brutti, Ali Taylan Cemgil, Tamar Kushnir, Ken Shan,
<a href="http://www.cs.berkeley.edu/~yweiss">Yair Weiss</a> and
Ron Zohar.
<p>
The following Intel employees have also contributed code:
Qian Diao, Shan Huang, Yimin Zhang and especially Wei Hu.

<p>
I would like to thank Stuart Russell for funding me over the years as
I developed BNT, and Gary Bradski for hiring me as an intern at Intel,
which has supported much of the recent development of BNT.


</body>
</html>
350