<html> <head>
<title>Bayes Net Toolbox for Matlab</title>
</head>

<body>
<!--<body bgcolor="#FFFFFF"> -->

<h1>Bayes Net Toolbox for Matlab</h1>
Written by Kevin Murphy, 1997--2002.
Last updated: 19 October 2007.

<P><P>
<table>
<tr>
<td>
<img align=left src="Figures/mathbymatlab.gif" alt="Matlab logo">
<!-- <img align=left src="toolbox.gif" alt="Toolbox logo">-->
<td>
<!--<center>-->
<a href="http://groups.yahoo.com/group/BayesNetToolbox/join">
<img src="http://groups.yahoo.com/img/ui/join.gif" border=0><br>
Click to subscribe to the BNT email list</a>
<br>
(<a href="http://groups.yahoo.com/group/BayesNetToolbox">
http://groups.yahoo.com/group/BayesNetToolbox</a>)
<!--</center>-->
</table>


<p>
<ul>
<li> <a href="changelog.html">Changelog</a>

<li> <a
href="http://www.cs.ubc.ca/~murphyk/Software/BNT/FullBNT-1.0.4.zip">Download
zip file</a>.

<li> <a href="install.html">Installation</a>

<li> <a href="license.gpl">Terms and conditions of use (GNU Library GPL)</a>


<li> <a href="usage.html">How to use the toolbox</a>

<li> <a href="whyNotSourceforge.html">Why I closed the SourceForge
site</a>.

<!--
<li> <a href="Talks/BNT_mathworks.ppt">Powerpoint slides on graphical models
and BNT</a>, presented to the Mathworks, June 2003


<li> <a href="Talks/gR03.ppt">Powerpoint slides on BNT and object
recognition</a>, presented at the <a
href="http://www.math.auc.dk/gr/gr2003.html">gR</a> workshop,
September 2003.
-->

<!--
<li> <a href="gR03.pdf">Proposed design for gR, a graphical models
toolkit in R</a>, September 2003.
(For more information on the gR project,
click <a href="http://www.r-project.org/gR/">here</a>.)
-->

<li>
<!--
<img src = "../new.gif" alt="new">
-->

<a href="../../Papers/bnt.pdf">Invited paper on BNT</a>,
published in
Computing Science and Statistics, 2001.

<li> <a href="../bnsoft.html">Other Bayes net software</a>

<!--<li> <a href="software.html">Other Matlab software</a>-->

<li> <a href="../../Bayes/bnintro.html">A brief introduction to
Bayesian networks</a>


<li> <a href="#features">Major features</a>
<li> <a href="#models">Supported models</a>
<!--<li> <a href="#future">Future work</a>-->
<li> <a href="#give_away">Why do I give the code away?</a>
<li> <a href="#why_matlab">Why Matlab?</a>
<li> <a href="#ack">Acknowledgments</a>
</ul>
<p>


<h2><a name="features">Major features</a></h2>
<ul>

<li> BNT supports many types of
<b>conditional probability distributions</b> (nodes),
and it is easy to add more.
<ul>
<li>Tabular (multinomial)
<li>Gaussian
<li>Softmax (logistic/sigmoid)
<li>Multi-layer perceptron (neural network)
<li>Noisy-or
<li>Deterministic
</ul>
<p>

<li> BNT supports <b>decision and utility nodes</b>, as well as chance
nodes, i.e., influence diagrams as well as Bayes nets.
<p>

<li> BNT supports static and dynamic BNs (useful for modelling dynamical systems
and sequence data).
<p>

<li> BNT supports many different <b>inference algorithms</b>,
and it is easy to add more.

<ul>
<li> Exact inference for static BNs:
<ul>
<li>junction tree
<li>variable elimination
<li>brute force enumeration (for discrete nets)
<li>linear algebra (for Gaussian nets)
<li>Pearl's algorithm (for polytrees)
<li>quickscore (for QMR)
</ul>

<p>
<li> Approximate inference for static BNs:
<ul>
<li>likelihood weighting
<li>Gibbs sampling
<li>loopy belief propagation
</ul>

<p>
<li> Exact inference for DBNs:
<ul>
<li>junction tree
<li>frontier algorithm
<li>forwards-backwards (for HMMs)
<li>Kalman-RTS (for LDSs)
</ul>

<p>
<li> Approximate inference for DBNs:
<ul>
<li>Boyen-Koller
<li>factored-frontier/loopy belief propagation
</ul>

</ul>
<p>

<li>
BNT supports several methods for <b>parameter learning</b>,
and it is easy to add more.
<ul>

<li> Batch MLE/MAP parameter learning using EM.
(Each node type has its own M method, e.g., softmax nodes use IRLS,
and each inference engine has its own E method, so the code is fully modular.)

<li> Sequential/batch Bayesian parameter learning (for fully observed tabular nodes only).
</ul>

<p>
<li>
BNT supports several methods for <b>regularization</b>,
and it is easy to add more.
<ul>
<li> Any node can have its parameters clamped (made non-adjustable).
<li> Any set of compatible nodes can have their parameters tied (cf.
weight sharing in a neural net).
<li> Some node types (e.g., tabular) support priors for MAP estimation.
<li> Gaussian covariance matrices can be declared full or diagonal, and can
be tied across states of their discrete parents (if any).
</ul>

<p>
<li>
BNT supports several methods for <b>structure learning</b>,
and it is easy to add more.
<ul>

<li> Bayesian structure learning,
using MCMC or local search (for fully observed tabular nodes only).

<li> Constraint-based structure learning (IC/PC and IC*/FCI).
</ul>

<p>
<li> The source code is extensively documented, object-oriented, and free, making it
an excellent tool for teaching, research and rapid prototyping.

</ul>
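<p>
To give a flavour of how these pieces fit together, here is a sketch of the
classic water-sprinkler network, built from tabular CPDs and queried with the
junction tree engine. The function names (<code>mk_bnet</code>,
<code>tabular_CPD</code>, <code>jtree_inf_engine</code>,
<code>enter_evidence</code>, <code>marginal_nodes</code>) are the standard
BNT API, but treat the details as approximate and see the
<a href="usage.html">usage guide</a> for the authoritative version.

```matlab
% Sketch of the water-sprinkler network (Cloudy, Sprinkler, Rain, WetGrass).
N = 4;
dag = zeros(N,N);
C = 1; S = 2; R = 3; W = 4;
dag(C,[S R]) = 1;              % Cloudy influences Sprinkler and Rain
dag(S,W) = 1;  dag(R,W) = 1;   % both influence WetGrass
node_sizes = 2*ones(1,N);      % all nodes binary (1 = false, 2 = true)
bnet = mk_bnet(dag, node_sizes);

% Tabular (multinomial) CPDs
bnet.CPD{C} = tabular_CPD(bnet, C, [0.5 0.5]);
bnet.CPD{R} = tabular_CPD(bnet, R, [0.8 0.2 0.2 0.8]);
bnet.CPD{S} = tabular_CPD(bnet, S, [0.5 0.9 0.5 0.1]);
bnet.CPD{W} = tabular_CPD(bnet, W, [1 0.1 0.1 0.01 0 0.9 0.9 0.99]);

% Exact inference with the junction tree engine
engine = jtree_inf_engine(bnet);
evidence = cell(1,N);
evidence{W} = 2;               % observe WetGrass = true
[engine, loglik] = enter_evidence(engine, evidence);
marg = marginal_nodes(engine, S);
marg.T                         % posterior P(Sprinkler | WetGrass = true)
```

Swapping <code>jtree_inf_engine</code> for any other engine (e.g., likelihood
weighting or Pearl's algorithm) leaves the rest of the code unchanged, which
is the sense in which the engines are interchangeable.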


<h2><a name="models">Supported probabilistic models</a></h2>
<p>
It is trivial to implement all of
the following probabilistic models using the toolbox.
<ul>
<li>Static
<ul>
<li> Linear regression, logistic regression, hierarchical mixtures of experts

<li> Naive Bayes classifiers, mixtures of Gaussians,
sigmoid belief nets

<li> Factor analysis, probabilistic
PCA, probabilistic ICA, mixtures of these models

</ul>

<li>Dynamic
<ul>

<li> HMMs, factorial HMMs, coupled HMMs, input-output HMMs, DBNs

<li> Kalman filters, ARMAX models, switching Kalman filters,
tree-structured Kalman filters, multiscale AR models

</ul>

<li> Many other combinations, for which there are (as yet) no names!

</ul>
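<p>
As an example of a dynamic model, an HMM can be expressed as a two-slice DBN
with one hidden node and one observed node per slice. This sketch follows the
DBN portion of the usage guide; the <code>mk_dbn</code> arguments and the
2TBN engine names are quoted from memory, so check the documentation before
relying on them.

```matlab
% HMM as a DBN: hidden state Q, observation Y, unrolled over time.
intra = zeros(2);  intra(1,2) = 1;   % within a slice: Q -> Y
inter = zeros(2);  inter(1,1) = 1;   % between slices: Q(t) -> Q(t+1)
Q = 2; O = 3;                        % 2 hidden states, 3 output symbols
ns = [Q O];
bnet = mk_dbn(intra, inter, ns, 'observed', 2);

% Tabular CPDs with random parameters: prior, observation, transition.
bnet.CPD{1} = tabular_CPD(bnet, 1);
bnet.CPD{2} = tabular_CPD(bnet, 2);
bnet.CPD{3} = tabular_CPD(bnet, 3);

% Fixed-interval smoothing with the 2TBN junction tree engine.
engine = smoother_engine(jtree_2TBN_inf_engine(bnet));
```

Factorial and coupled HMMs differ only in the <code>intra</code> and
<code>inter</code> adjacency matrices, which is why they come "for free".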


<!--
<h2><a name="future">Future work</a></h2>

I have a long <a href="wish.txt">wish list</a>
of features I would like to add to BNT
at some point in the future.
Please email me (<a
href="mailto:murphyk@cs.berkeley.edu">murphyk@cs.berkeley.edu</a>)
if you are interested in contributing!
-->



<h2><a name="give_away">Why do I give the code away?</a></h2>

<ul>

<li>
I was hoping for a Linux-style effect, whereby people would contribute
their own Matlab code so that the package would grow. With a few
exceptions, this has not happened,
although several people have provided bug fixes (see the <a
href="#ack">acknowledgments</a>).
Perhaps the <a
href="http://www.cs.berkeley.edu/~murphyk/OpenBayes/index.html">Open
Bayes Project</a> will be more
successful in this regard, although the evidence to date is not promising.

<p>
<li>
Knowing that someone else might read your code forces you to
document it properly, which is good practice in any case, as anyone who
has revisited old code knows.
In addition, with many "eyeballs" looking at the code, bugs are easier to spot.


<p>
<li>
I believe in the concept of
<a href="http://www-stat.stanford.edu/~donoho/Reports/1995/wavelab.pdf">
reproducible research</a>.
Good science requires that other people be able
to replicate your experiments.
Often a paper does not give enough details about how exactly an
algorithm was implemented (e.g., how were the parameters chosen? what
initial conditions were used?), and these details can make a big difference in
practice.
Hence one should release the code that
was actually used to generate the results in one's paper.
This also prevents re-inventing the wheel.

<p>
<li>
I was fed up with reading papers whose only contribution is to work out how
to do exact inference and/or learning
in a model which is just a trivial special case of a general Bayes net, e.g.,
input-output HMMs, coupled HMMs, auto-regressive HMMs.
My hope is that, by releasing general purpose software, the field can
move on to more interesting questions.
As Alfred North Whitehead said in 1911,
"Civilization advances by extending the number of important operations
that we can do without thinking about them."

</ul>
<h2><a name="why_matlab">Why Matlab?</a></h2>

Matlab is an interactive, matrix-oriented programming language that
enables one to express mathematical ideas very concisely and directly,
without having to worry about annoying details like memory allocation
or type checking. This considerably reduces development time and
keeps code short, readable and fully portable.
Matlab has excellent built-in support for many data analysis and
visualization routines. In addition, there are many useful toolboxes, e.g., for
neural networks, signal and image processing.
The main disadvantages of Matlab are that it can be slow (which is why
we are currently rewriting parts of BNT in C), and that the commercial
license is expensive (although the student version is only $100 in the US).
<p>
Many people ask me why I did not use
<a href="http://www.octave.org/">Octave</a>,
an open-source Matlab clone.
The reason is that
Octave does not support multi-dimensional arrays,
cell arrays, objects, etc.
<p>
Click <a href="../which_language.html">here</a> for a more detailed
comparison of Matlab and other languages.


<h2><a name="ack">Acknowledgments</a></h2>

I would like to thank numerous people for bug fixes, including:
Rainer Deventer, Michael Robert James, Philippe Leray, Pedrito Maynard-Reid II, Andrew Ng,
Ron Parr, Ilya Shpitser, Xuejing Sun, Ursula Sondhauss.
<p>
I would like to thank the following people for contributing code:
Pierpaolo Brutti, Ali Taylan Cemgil, Tamar Kushnir,
Tom Murray,
Nicholas Saunier,
Ken Shan,
Yair Weiss,
Bob Welch,
Ron Zohar.
<p>
The following Intel employees have also contributed code:
Qian Diao, Shan Huang, Yimin Zhang and especially Wei Hu.

<p>
I would like to thank Stuart Russell for funding me over the years as
I developed BNT, and Gary Bradski for hiring me as an intern at Intel,
which has supported much of the recent development of BNT.


</body>
</html>