changeset 8:9e62017140e6
Some slightly better text, and better pics..
author   | Henrik Ekeus <hekeus@eecs.qmul.ac.uk>
date     | Fri, 03 Feb 2012 12:59:23 +0000
parents  | 9b14f9db3c30
children | 07cb047f8d13
files    | nime2012/MatrixDistribution.png nime2012/NonDeterministicMatrix.png nime2012/NonDeterministicMatrix_bw.png nime2012/PeriodicMatrix.png nime2012/mtriange.pdf nime2012/mtriange.tex
diffstat | 6 files changed, 15 insertions(+), 5 deletions(-)
--- a/nime2012/mtriange.tex	Thu Feb 02 18:12:54 2012 +0000
+++ b/nime2012/mtriange.tex	Fri Feb 03 12:59:23 2012 +0000
@@ -34,21 +34,31 @@
 \section{Introduction}
 
- Music generally involves patterns in time. Composers commonly, consciously or not, play with his or her audience's expectations by setting up patterns that seem more or less predictable, and thus manipulate expectations and surprise in the listener[ref]. The research into Information Dynamics explores several different kinds of predictability in musical patterns, how human listeners might perceive these, and how they shape or affect the listening experience.
+ Music generally involves patterns in time. Composers commonly, consciously or not, play with their audience's expectations by setting up patterns that seem more or less predictable, and thus manipulate expectation and surprise in the listener [ref]. The research into Information Dynamics explores several different kinds of predictability in musical patterns, how human listeners might perceive these, and how they shape or affect the listening experience.
 
 \section{Information Dynamics and the Triangle}
 
 (some background on IDyOM and Markov chains)
+
+
+
+
 \begin{figure*}[t]
 \centering
-\includegraphics[width=1.0\textwidth]{InfoDynEngine.pdf}
-\caption{Screen Shot from the Information Dynamics engine - the current and next transition matrixes are on the left. The upper one has no uncertainty and thus represents a periodic pattern. The lower one contains unpredictability but nonetheless is not completely without perceivable structure. On the right we see the population of transition matrixes distributed along three axies of redundancy, entropy rate and predictive-information rate. Note how the distribution makes triangle-like plane floating in 3d space.\label{InfoDynEngine}}
+\includegraphics[width=0.3\textwidth]{PeriodicMatrix.png}
+\includegraphics[width=0.3\textwidth]{NonDeterministicMatrix_bw.png}
+\caption{Two transition matrices.
The colour represents the probability of transition from one state to the next (black=0, white=1). The left matrix has no uncertainty: it represents a periodic pattern. The right-hand matrix contains unpredictability but is nonetheless not completely without perceivable structure.\label{TransitionMatrixes}}
 \end{figure*}
 
+The Information Dynamics model operates on discrete symbols; only at the output stage is a symbol mapped to a particular note. Each stream of symbols is at any one time defined by a transition matrix, which gives the probability distribution for the symbol following the current one. In fig.\ref{TransitionMatrixes} we see two transition matrices. The one on the left has no uncertainty and therefore outlines a periodic pattern. The one on the right has unpredictability but is nonetheless not completely without perceivable structure; it is of a higher entropy. The current symbol is along the bottom, and in this case there are twelve possibilities (mapped to a chromatic scale). In the left-hand matrix we can see, for example, that symbol 4 must follow symbol 3, that symbol 10 must follow symbol 4, and so on.
-
-The Information Dynamics model operates on discreet symbols, only at the output stage is any symbol mapped to a particular note. Each stream of symbols is at any one time defined by a transition matrix. A transition matrix defines the probabilistic distribution for the symbol following the current one. In fig.\ref{InfoDynEngine}, on the left we see two transition matrices, the upper one having no uncertainty and therefore outlining a periodic pattern. The lower one containing considerable unpredictability but is nonetheless not completely without perceivable structure, it is of a higher entropy. The current symbol is along the bottom, and in this case there are twelve possibilities. In the upper matrix, we can see for example that symbol 4 must follow symbol 3, and that symbol 10 must follow symbol 4, and so on.
Hundreds of transition matrixes are generated, and they are then placed in a 3d statistical space based on 3 information measures calculated from the matrix, these are redundancy, entropy rate, and predictive-information rate [see [cite]]. In fig.\ref{InfoDynEngine} on the right, we see a representation of these matrixes distributed; each one of these points corresponds to a transition matrix. Entropy rate is the average uncertainty for the next symbol as we go through the sequence. A looping sequence has 0 entropy, a sequence that is difficult to predict has high entropy rate. Entropy rate is an average of `surprisingness' over time.
+\begin{figure*}[t]
+\centering
+\includegraphics[width=0.75\textwidth]{MatrixDistribution.png}
+\caption{The population of transition matrices distributed along three axes of redundancy, entropy rate and predictive-information rate. Note how the distribution forms a triangle-like plane floating in 3d space.\label{InfoDynEngine}}
+\end{figure*}
+Hundreds of transition matrices are generated and then placed in a 3d statistical space based on three information measures calculated from each matrix: redundancy, entropy rate, and predictive-information rate [see [cite]]. In fig.\ref{InfoDynEngine} we see these matrices distributed; each point corresponds to a transition matrix. Entropy rate is the average uncertainty about the next symbol as we go through the sequence: a looping sequence has an entropy rate of 0, while a sequence that is difficult to predict has a high entropy rate. Entropy rate is thus an average of `surprisingness' over time. Redundancy is the difference between the uncertainty before we look at the context (the fixed-point distribution) and the uncertainty after we have looked at the context.
For instance a matrix with high redundancy, such as one that represents a long periodic sequence, would have high uncertainty before we look at the context, but as soon as we look at the previous symbol the uncertainty drops to zero, because we now know what is coming next. Predictive-information rate tells us the average reduction in uncertainty upon perceiving a symbol; a system with a high predictive-information rate is one in which each symbol tells us more about the next. In a purely periodic sequence, each symbol tells us nothing about the next that we did not already know, since we already know how the pattern goes. Similarly, in a seemingly uncorrelated sequence, seeing the next symbol tells us nothing more because the symbols are completely independent anyway; there is no pattern. There is a subset of transition matrices with high predictive-information rate, and it contains neither the periodic ones nor the completely uncorrelated ones. Rather, these matrices tend to yield output with certain characteristic patterns, though a listener cannot necessarily know when they will occur; a particular sequence of symbols might, however, tell us which of the characteristic patterns will show up next. Each symbol tells us a little about the future, but nothing about the infinite future; we only learn about that as time goes on, so there is a continual building of prediction. When we look at the distribution of matrices generated by a random sampling method in this 3d space of entropy rate, redundancy and predictive-information rate, it forms an arch shape that is fairly thin, and it thus becomes a reasonable approximation to treat it as a two-dimensional sheet (see fig.\ref{InfoDynEngine}). It is this triangular sheet (fig.\ref{TheTriangle}) that is then mapped either to the screen or, in the case of the interactive installation, to physical space.
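Predictive-information rate can be sketched in the same way. The formulation below, the information the current symbol carries about the next beyond what the previous one already told us, H(next | previous) − H(next | current), computed via the two-step matrix A², is my reading of the information-dynamics literature and an assumption, not a formula quoted from this paper; the matrices are again hypothetical.

```python
import numpy as np

def stationary(A):
    # Long-run symbol distribution: the eigenvector of A for eigenvalue 1.
    w, v = np.linalg.eig(A)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def H(p):
    # Shannon entropy in bits, ignoring zero-probability entries.
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_rate(A):
    pi = stationary(A)
    return sum(pi[j] * H(A[:, j]) for j in range(A.shape[1]))

def pir(A):
    # Assumed predictive-information rate for a first-order Markov chain:
    # H(next | previous) - H(next | current), via the two-step matrix A @ A.
    pi = stationary(A)
    A2 = A @ A
    h2 = sum(pi[j] * H(A2[:, j]) for j in range(A.shape[1]))
    return h2 - entropy_rate(A)

P = np.roll(np.eye(12), 1, axis=0)  # periodic cycle
U = np.full((12, 12), 1.0 / 12.0)   # completely uncorrelated noise
N = 0.7 * P + 0.3 / 12              # structured but unpredictable

print(pir(P), pir(U), pir(N))  # periodic ~0, noise ~0, the in-between matrix positive
```

This mirrors the argument above: both the purely periodic and the completely uncorrelated extremes carry no predictive information, while matrices between the two extremes do.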
Each corner corresponds to one of three extremes of predictability/unpredictability, which could loosely be characterised as periodicity, noise and repetition.
 \begin{figure*}[t]