changeset 29:b78165a228db

tidy
author Henrik Ekeus <hekeus@eecs.qmul.ac.uk>
date Tue, 07 Feb 2012 16:38:40 +0000
parents dd51fad87cd9
children 83b0352b9fbe
files nime2012/mtriange.pdf nime2012/mtriange.tex
diffstat 2 files changed, 5 insertions(+), 6 deletions(-)
Binary file nime2012/mtriange.pdf has changed
--- a/nime2012/mtriange.tex	Tue Feb 07 01:36:56 2012 +0000
+++ b/nime2012/mtriange.tex	Tue Feb 07 16:38:40 2012 +0000
@@ -18,7 +18,6 @@
 %
 %formal descriptions of redundancy, entropy rate, predictive information rate
 %discussion on its use as a composition assistant.. 
-%comments on the aesthetics of the output (why it all sounds like minimalism)
 %better triangle diagram (fix shading)
 %
 %experiment section
@@ -78,14 +77,14 @@
 
 \subsection{Information measures}
 \subsubsection{Redundancy}
-[todo - a more formal description]
+[todo (Samer) - a more formal description]
 Redundancy measures the difference between our uncertainty about the next symbol before we look at the context (i.e.\ under the fixed point distribution) and our uncertainty after we look at the context.  For instance, a transition matrix with high redundancy, such as one representing a long periodic sequence, gives high uncertainty before we look at the context; but as soon as we see the previous symbol, the uncertainty drops to zero, because we then know exactly what is coming next.
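 One standard way to make this precise (a sketch using textbook definitions, pending the formal description flagged above): for a first-order Markov chain over a finite alphabet with transition matrix $a$, where $a_{ij} = \Pr(X_{t+1}=j \mid X_t=i)$, and stationary (fixed point) distribution $\pi$, the redundancy can be written
 \begin{equation}
 \rho = H(\pi) - h, \qquad H(\pi) = -\sum_i \pi_i \log \pi_i,
 \end{equation}
 i.e.\ the entropy of the fixed point distribution minus the entropy rate $h$ defined in the next subsection.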
 \subsubsection{Entropy rate}
-[todo - a more formal description]
+[todo (Samer) - a more formal description]
 Entropy rate is the average uncertainty about the next symbol as we move through the sequence.  A looping sequence has an entropy rate of zero, while a sequence that is difficult to predict has a high entropy rate.  In other words, entropy rate is the average `surprisingness' over time.
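 In the same notation (again a sketch of the standard definition), the entropy rate of the Markov chain is the uncertainty of the next-symbol distribution averaged over the stationary distribution:
 \begin{equation}
 h = -\sum_i \pi_i \sum_j a_{ij} \log a_{ij}.
 \end{equation}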
 
 \subsubsection{Predictive Information Rate}
-[todo - a more formal description]
+[todo (Samer) - a more formal description]
 Predictive information rate tells us the average reduction in uncertainty about the future upon perceiving each symbol; in a system with a high predictive information rate, each symbol tells us more about what is to come.
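 One standard formalisation (a sketch: for a first-order Markov chain, once $X_{t+1}$ is known the more distant future carries no further information about $X_t$, so the general definition reduces to a three-variable quantity):
 \begin{equation}
 b = I(X_t ; X_{t+1} \mid X_{t-1}) = H(X_{t+1} \mid X_{t-1}) - H(X_{t+1} \mid X_t),
 \end{equation}
 which is the entropy rate computed from the two-step transition matrix $a^2$ minus the entropy rate of $a$.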
 
 If we imagine a purely periodic sequence, each symbol tells us nothing about the next that we did not already know, since we already know how the pattern goes.  Similarly, with a seemingly uncorrelated sequence, seeing the next symbol tells us nothing more, because the symbols are completely independent anyway; there is no pattern.  There is, however, a subset of transition matrices with high predictive information rate, and it contains neither the periodic matrices nor the completely uncorrelated ones.  Rather, these matrices tend to yield output with certain characteristic patterns, though a listener cannot necessarily know when those patterns will occur.  A particular sequence of symbols might nevertheless tell us which of the characteristic patterns will show up next.  Each symbol tells us a little about the future, but nothing about the infinite future; we only learn about that as time goes on, so prediction is continually built up.
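 To illustrate how the three measures behave at the extremes, here is a minimal numerical sketch (illustrative only, not the system's implementation; the example matrices are chosen purely for the demonstration):
 \begin{verbatim}
 import numpy as np
 
 def stationary(a):
     # Stationary distribution pi satisfying pi a = pi
     # (rows of a sum to 1).
     vals, vecs = np.linalg.eig(a.T)
     pi = np.real(vecs[:, np.argmax(np.real(vals))])
     return pi / pi.sum()
 
 def entropy(p):
     p = p[p > 0]
     return -np.sum(p * np.log2(p))
 
 def entropy_rate(a, pi):
     # Average next-symbol uncertainty, weighted by pi.
     return sum(pi[i] * entropy(a[i]) for i in range(len(pi)))
 
 def measures(a):
     pi = stationary(a)
     h = entropy_rate(a, pi)          # entropy rate
     rho = entropy(pi) - h            # redundancy
     b = entropy_rate(a @ a, pi) - h  # predictive information rate
     return rho, h, b
 
 # Deterministic 3-cycle: high redundancy, zero entropy rate, zero PIR.
 periodic = np.array([[0., 1., 0.], [0., 0., 1.], [1., 0., 0.]])
 # Uniform noise: zero redundancy, high entropy rate, zero PIR.
 noise = np.full((3, 3), 1. / 3.)
 print(measures(periodic))  # approx (1.585, 0.0, 0.0)
 print(measures(noise))     # approx (0.0, 1.585, 0.0)
 \end{verbatim}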
@@ -111,7 +110,7 @@
   \begin{figure}[h]
 \centering
 \includegraphics[width=0.5\textwidth]{TheTriangle.pdf}
-\caption{The Melody Triangle [todo fix shading to be more triangular]  \label{TheTriangle}}
+\caption{The Melody Triangle\label{TheTriangle}}
 \end{figure}
 
 \subsection{Making the triangle}
@@ -126,7 +125,7 @@
    When the Melody Triangle is used, whether as a screen-based system or as an interactive installation, a user's input is mapped to a position in this statistical space, and a transition matrix corresponding to that position is returned. As can be seen in figure \ref{TheTriangle}, a position within the triangle maps to different measures of redundancy, entropy rate and predictive information rate.  
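 The lookup itself is not spelled out at this point, but a plausible sketch (hypothetical names; the nearest-neighbour scheme over a precomputed population of matrices is an assumption, not necessarily the actual implementation) is:
 \begin{verbatim}
 import numpy as np
 
 def nearest_matrix(position, coords, matrices):
     # position: the point chosen in the triangle, expressed in the
     # same coordinates as coords (e.g. entropy rate and redundancy).
     # coords[k]: the measures precomputed for candidate matrices[k].
     d = np.linalg.norm(coords - np.asarray(position), axis=1)
     return matrices[int(np.argmin(d))]
 \end{verbatim}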
  
 %%%paragraph explaining what the different parts of the triangle are like.
-  [todo improve, include example melodies?] Each corner corresponds to three different extremes of predictability and unpredictability, which could be loosely characterised as periodicity, noise and repetition. Melodies from the `noise' corner have no discernible pattern; they have high entropy rate, low predictive information rate and low redundancy. These melodies are essentially totally random. A melody along the `periodicity' to `repetition' edge are all deterministic loops that get shorter as we approach the `repetition' corner, until it becomes just one repeating note.  It is the areas in between that provide the more interesting melodies, those that have some level of unpredictability, but are not completely random and conversely that are predictable, but not entirely so.  This triangular space allows for an intuitive exploration of expectation and surprise in temporal sequences based on a simple model of how one might guess the next event given the previous one.     
+Each corner corresponds to one of three extremes of predictability and unpredictability, which could be loosely characterised as periodicity, noise and repetition. Melodies from the `noise' corner have no discernible pattern; they have high entropy rate, low predictive information rate and low redundancy. These melodies are essentially totally random. Melodies along the `periodicity' to `repetition' edge are all deterministic loops that get shorter as we approach the `repetition' corner, until each becomes just one repeating note.  It is the areas in between that provide the more interesting melodies: those that have some level of unpredictability but are not completely random, and conversely those that are predictable but not entirely so.  This triangular space allows for an intuitive exploration of expectation and surprise in temporal sequences, based on a simple model of how one might guess the next event given the previous one.