
<h2><a name="features">Major features</a></h2>
<ul>

<li> BNT supports many types of
<b>conditional probability distributions</b> (nodes),
and it is easy to add more.
<ul>
<li>Tabular (multinomial)
<li>Gaussian
<li>Softmax (logistic/sigmoid)
<li>Multi-layer perceptron (neural network)
<li>Noisy-or
<li>Deterministic
</ul>
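<p>
Two of these node types can be illustrated compactly. BNT itself is a MATLAB toolbox; the sketch below is an illustration in Python only, and the names (<tt>cpt_rain</tt>, <tt>noisy_or</tt>) are hypothetical, not part of BNT's API. A tabular CPD is just a conditional probability table indexed by parent states; a noisy-OR CPD assumes each active parent independently fails to switch the child on with some inhibition probability.

```python
import numpy as np

# Tabular CPD for a binary node R with one binary parent C:
# rows index the parent state, columns the child state.
cpt_rain = np.array([[0.8, 0.2],   # P(R | C=false)
                     [0.2, 0.8]])  # P(R | C=true)

def noisy_or(parent_states, inhibit):
    """Noisy-OR CPD: each active parent i fails to turn the child on
    with probability inhibit[i]; the child stays off only if all fail."""
    p_off = 1.0
    for on, q in zip(parent_states, inhibit):
        if on:
            p_off *= q
    return 1.0 - p_off
```

For example, with both parents on and inhibition probabilities 0.1 and 0.2, the child is on with probability 1 - 0.1 * 0.2 = 0.98.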
<p>

<li> BNT supports <b>decision and utility nodes</b>, as well as chance
nodes,
i.e., influence diagrams in addition to Bayes nets.
<p>

<li> BNT supports static and dynamic BNs (useful for modelling dynamical systems
and sequence data).
<p>

<li> BNT supports many different <b>inference algorithms</b>,
and it is easy to add more.

<ul>
<li> Exact inference for static BNs:
<ul>
<li>junction tree
<li>variable elimination
<li>brute force enumeration (for discrete nets)
<li>linear algebra (for Gaussian nets)
<li>Pearl's algorithm (for polytrees)
<li>quickscore (for QMR)
</ul>
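<p>
Brute force enumeration is the conceptual baseline for the exact methods above: sum the full joint over all assignments consistent with the evidence, then normalize. As an illustration only (Python, not BNT's MATLAB API; the toy network and names are my own), here it is for a two-node net A &rarr; B:

```python
# Brute-force enumeration on a toy binary net A -> B:
# P(A | B=b) is computed by summing the full joint table.
p_a = [0.7, 0.3]                    # P(A)
p_b_given_a = [[0.9, 0.1],          # P(B | A=0)
               [0.4, 0.6]]          # P(B | A=1)

def joint(a, b):
    return p_a[a] * p_b_given_a[a][b]

def query(evidence_b):
    """Posterior over A given B = evidence_b, by enumeration."""
    unnorm = [sum(joint(a, b) for b in range(2) if b == evidence_b)
              for a in range(2)]
    z = sum(unnorm)                  # probability of the evidence
    return [p / z for p in unnorm]
```

With these numbers, observing B=1 gives the posterior [0.28, 0.72] over A. Enumeration is exponential in the number of nodes, which is exactly what junction tree and variable elimination avoid by exploiting the graph structure.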

<p>
<li> Approximate inference for static BNs:
<ul>
<li>likelihood weighting
<li>Gibbs sampling
<li>loopy belief propagation
</ul>
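<p>
Likelihood weighting, the first of these, is easy to sketch: sample the unobserved nodes from their CPDs in topological order, and weight each sample by the likelihood of the evidence. The Python below (illustrative only, reusing the toy A &rarr; B net from above with made-up numbers) estimates P(A=1 | B=1), whose exact value is 0.72:

```python
import random

random.seed(0)
p_a1 = 0.3
p_b1_given_a = [0.1, 0.6]            # P(B=1 | A)

def lw_estimate(n=100_000):
    """Likelihood weighting: sample the unobserved A from its prior,
    weight each sample by the likelihood of the evidence B=1."""
    num = den = 0.0
    for _ in range(n):
        a = 1 if random.random() < p_a1 else 0
        w = p_b1_given_a[a]          # evidence weight
        num += w * a
        den += w
    return num / den                 # estimate of P(A=1 | B=1), approx. 0.72
```

Unlike rejection sampling, no samples are discarded; the weights do the conditioning.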

<p>
<li> Exact inference for DBNs:
<ul>
<li>junction tree
<li>frontier algorithm
<li>forwards-backwards (for HMMs)
<li>Kalman-RTS (for LDSs)
</ul>
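<p>
The forwards-backwards algorithm for HMMs is the simplest of these exact DBN methods. A minimal unscaled sketch (illustrative Python, not BNT code; the 2-state model parameters are invented):

```python
import numpy as np

# Forwards-backwards smoothing for a 2-state HMM.
pi = np.array([0.5, 0.5])            # initial state distribution
A  = np.array([[0.9, 0.1],           # A[s, s'] = P(next s' | state s)
               [0.2, 0.8]])
B  = np.array([[0.8, 0.2],           # B[s, o] = P(obs o | state s)
               [0.3, 0.7]])

def forward_backward(obs):
    T, S = len(obs), len(pi)
    alpha = np.zeros((T, S))         # forwards (filtered) messages
    beta = np.ones((T, S))           # backwards messages
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta             # unnormalized smoothed marginals
    return gamma / gamma.sum(axis=1, keepdims=True)
```

A production version rescales alpha and beta at each step to avoid underflow on long sequences; Kalman-RTS plays the same forwards/backwards role for linear-Gaussian state spaces.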

<p>
<li> Approximate inference for DBNs:
<ul>
<li>Boyen-Koller
<li>factored-frontier/loopy belief propagation
</ul>
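<p>
The core idea of Boyen-Koller is that the exact belief state over all hidden variables grows intractably correlated, so after each exact propagation step it is projected back onto a factored (product-of-marginals) form. A minimal sketch of one such step for a two-variable slice, in illustrative Python with a randomly generated transition model (none of this is BNT's API):

```python
import numpy as np

# Boyen-Koller projection step for a 2-variable DBN slice.
# T[x, y, x2, y2] = P(X'=x2, Y'=y2 | X=x, Y=y), binary variables.
rng = np.random.default_rng(0)
T = rng.random((2, 2, 2, 2))
T /= T.sum(axis=(2, 3), keepdims=True)   # normalize over next state

def bk_step(mx, my):
    """One BK step: build the factored prior, propagate it exactly
    through one time slice, then project the resulting joint back
    onto the marginals of X' and Y'."""
    prior = np.outer(mx, my)                        # factored belief state
    joint_next = np.einsum('xy,xyab->ab', prior, T) # exact one-step update
    return joint_next.sum(axis=1), joint_next.sum(axis=0)
```

The projection introduces a bounded error per step, which is what makes BK a controlled approximation rather than an ad hoc one.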

</ul>
<p>

<li>
BNT supports several methods for <b>parameter learning</b>,
and it is easy to add more.
<ul>

<li> Batch MLE/MAP parameter learning using EM.
(Each node type has its own M method, e.g. softmax nodes use IRLS,<br>
and each inference engine has its own E method, so the code is fully modular.)

<li> Sequential/batch Bayesian parameter learning (for fully observed tabular nodes only).
</ul>
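<p>
For fully observed tabular nodes, sequential Bayesian learning is just conjugate counting: with a Dirichlet prior, each observation adds one to the corresponding pseudo-count, and the posterior predictive is the normalized count vector. A sketch (illustrative Python, not BNT code):

```python
import numpy as np

# Sequential Bayesian learning for a 3-valued tabular node.
alpha = np.array([1.0, 1.0, 1.0])     # uniform Dirichlet prior

def update(alpha, observed_value):
    """Conjugate Dirichlet update: add one pseudo-count per observation."""
    alpha = alpha.copy()
    alpha[observed_value] += 1.0
    return alpha

for v in [0, 0, 2, 0, 1]:             # observations arrive one at a time
    alpha = update(alpha, v)

posterior_mean = alpha / alpha.sum()  # posterior predictive distribution
```

After these five observations the counts are [4, 2, 2], giving a predictive distribution of [0.5, 0.25, 0.25]; running the same counts in one batch gives the identical posterior, which is why sequential and batch modes coexist here.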


<p>
<li>
BNT supports several methods for <b>regularization</b>,
and it is easy to add more.
<ul>
<li> Any node can have its parameters clamped (made non-adjustable).
<li> Any set of compatible nodes can have their parameters tied (cf.
weight sharing in a neural net).
<li> Some node types (e.g., tabular) support priors for MAP estimation.
<li> Gaussian covariance matrices can be declared full or diagonal, and can
be tied across states of their discrete parents (if any).
</ul>

<p>
<li>
BNT supports several methods for <b>structure learning</b>,
and it is easy to add more.
<ul>

<li> Bayesian structure learning,
using MCMC or local search (for fully observed tabular nodes only).

<li> Constraint-based structure learning (IC/PC and IC*/FCI).
</ul>
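<p>
The building block behind Bayesian structure search is a decomposable score: for fully observed tabular nodes with Dirichlet priors, the marginal likelihood of each node given its parents has a closed form, and MCMC or local search compares candidate graphs by summing these per-family terms. A sketch of one such Bayesian Dirichlet family score (illustrative Python with made-up counts, not BNT's implementation; a production score such as BDeu divides the prior strength differently):

```python
from math import lgamma

def family_score(counts, alpha=1.0):
    """Log marginal likelihood of one node's counts under one parent
    configuration, with a symmetric Dirichlet(alpha) prior."""
    n = sum(counts)
    k = len(counts)
    score = lgamma(k * alpha) - lgamma(k * alpha + n)
    for c in counts:
        score += lgamma(alpha + c) - lgamma(alpha)
    return score

# 20 samples in which binary A and B are strongly correlated:
# B's counts split by A (edge A -> B), versus B's marginal counts (no edge).
score_edge = family_score([9, 1]) + family_score([1, 9])
score_no_edge = family_score([10, 10])
```

On this data the edge A &rarr; B scores higher, so a local-search move that adds it would be accepted; A's own family term is identical in both graphs and cancels, which is what makes the score decomposable.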


<p>
<li> The source code is extensively documented, object-oriented, and free, making it
an excellent tool for teaching, research and rapid prototyping.

</ul>