diff nime2012/mtriange.tex @ 12:d26e4561d036

general improvements, new triangle image
author Henrik Ekeus <hekeus@eecs.qmul.ac.uk>
date Sat, 04 Feb 2012 01:57:04 +0000
parents 0d5cdc287fa4
children 5e2c50450ace
--- a/nime2012/mtriange.tex	Sat Feb 04 00:38:55 2012 +0000
+++ b/nime2012/mtriange.tex	Sat Feb 04 01:57:04 2012 +0000
@@ -25,7 +25,7 @@
 \begin{document}
 \maketitle
 \begin{abstract}
-The Melody Triangle is a Markov-chain based melody generator where the input - positions within a triangle - directly map to information theoretic measures of its output.  The measures are the amount of entropy, redundancy and \emph{predictive information rate}\cite{Abdallah} in the melody. Predictive information rate is a \emph{time-varying} information measure developed as part of the Information Dynamics of Music project(IDyOM)\footnote{http://www.idyom.org/}.  It characterises temporal structure and is a way of modelling expectation and surprise in the perception of music. 
+The Melody Triangle is a Markov-chain based melody generator where the input - positions within a triangle - directly maps to information theoretic measures of its output.  The measures are the entropy rate, redundancy and \emph{predictive information rate}\cite{Abdallah} of the melody. Predictive information rate is a \emph{time-varying} information measure developed as part of the Information Dynamics of Music project (IDyOM)\footnote{http://www.idyom.org/}.  It characterises temporal structure and is a way of modelling expectation and surprise in the perception of music. 
 
 We describe the Information Dynamics model and how it forms the basis of the Melody Triangle.  We outline two interfaces and uses of the system.  The first is a multi-user installation where collaboration in a performative setting provides a playful yet informative way to explore expectation and surprise in music.  The second is a screen-based interface where the Melody Triangle becomes a compositional tool that allows for the generation of intricate musical textures using an abstract, high-level description of predictability. Finally, we outline a study in which the screen-based interface was used under experimental conditions to determine how the three measures of predictive information rate, entropy rate and redundancy might relate to musical preference. We found that\dots   	
 
@@ -39,27 +39,28 @@
 The research into Information Dynamics explores several different kinds of predictability in musical patterns, how human listeners might perceive these, and how they shape or affect the listening experience. [more on IDyOM]
 
 
-
-\subsection{Predictive Information Rate}
-[todo quick summary of predictive information rate]
-
-\subsection{Redundancy and Entropy}
-[todo quick summary of the redundancy and entropy measures used]
-
-
 \section{The Melody Triangle}
 %%%How we created the transition matrixes and created the triangle.
+The Melody Triangle is based on first-order Markov chains, represented as transition matrixes, which generate streams of symbols.  By mapping the symbols to individual notes, melodies are generated, and by layering several such streams, complex musical textures can be built up. The choice of notes or scale is not part of the Melody Triangle's core functionality; the symbols could be mapped to anything, even non-sonic outputs.  
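+As a minimal sketch of this generative step (the function and variable names, and the NumPy implementation, are illustrative and not taken from the actual system), each new symbol is drawn from the row of the transition matrix indexed by the current symbol and then mapped to a pitch:
+\begin{verbatim}
+import numpy as np
+
+def generate_melody(A, pitches, length, seed=0):
+    # A: row-stochastic transition matrix (k x k)
+    # pitches: list of k pitches, e.g. MIDI note numbers
+    rng = np.random.default_rng(seed)
+    k = A.shape[0]
+    state = rng.integers(k)                # arbitrary initial symbol
+    melody = []
+    for _ in range(length):
+        state = rng.choice(k, p=A[state])  # next symbol ~ current row
+        melody.append(pitches[state])
+    return melody
+
+# e.g. twelve symbols mapped onto a chromatic scale from middle C:
+# generate_melody(A, [60 + i for i in range(12)], 16)
+\end{verbatim}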
 
-Given a stream of symbols we can calculate values for redundancy, entropy and predictive information rate as they happen.  The Melody Triangle goes 'backwards' - given values for redundancy, entropy and predictive information rate, we return a stream of symbols that match those measures.  
+Given a sequence of symbols we can compute information theoretic measures of it.  The novelty of the Melody Triangle is that it works 'backwards': given values for these measures, as determined from the user interface, it returns a stream of symbols that matches them.  The measures used are redundancy, entropy rate and predictive information rate.  
+
+\subsection{Information measures}
+\subsubsection{Redundancy}
+Redundancy is the difference between the uncertainty about the next symbol before we look at the context (i.e. under the fixed point distribution) and the uncertainty after we look at the context.  For instance, a transition matrix with high redundancy, such as one that represents a long periodic sequence, has high uncertainty before we look at the context, but as soon as we look at the previous symbol the uncertainty drops to zero, because we then know what is coming next.
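+For a first-order Markov chain this has a simple closed form.  Writing $a$ for the transition matrix, $\pi$ for its fixed point (stationary) distribution, and $h$ for the entropy rate defined below (notation introduced here purely for illustration), the redundancy is the entropy of $\pi$ minus the entropy rate:
+\begin{equation}
+\rho = -\sum_i \pi_i \log_2 \pi_i \;-\; h .
+\end{equation}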
+\subsubsection{Entropy rate}
+Entropy rate is the average uncertainty about the next symbol as we go through the sequence.  A looping sequence has an entropy rate of zero, while a sequence that is difficult to predict has a high entropy rate.  Entropy rate is thus an average of `surprisingness' over time.  
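+In the same illustrative notation, the entropy rate is the expected uncertainty about the next symbol given the current one:
+\begin{equation}
+h = -\sum_i \pi_i \sum_j a_{ij} \log_2 a_{ij} .
+\end{equation}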
 
-First order Markov chains, represented as transition matrixes, generate the streams.  By mapping the symbols to individual notes, melodies are generated.  By layering multiple instances of this process, complex musical textures can be generated. The choice of notes and scale is not a part of the Melody Triangle's core functionality, the symbols could be mapped to anything, even non sonic outputs.  
+\subsubsection{Predictive Information Rate}
+Predictive information rate tells us the average reduction in uncertainty about the future upon perceiving a symbol; in a system with high predictive information rate, each symbol tells us more about the ones to come.  In a purely periodic sequence, each symbol tells us nothing we did not already know, because we already know how the pattern continues.  Similarly, in a seemingly uncorrelated sequence, seeing the next symbol tells us nothing more, because the symbols are completely independent; there is no pattern to learn.  There is, however, a subset of transition matrixes with high predictive information rate, and they are neither the periodic ones nor the completely uncorrelated ones.  They tend to yield output with certain characteristic patterns, but a listener cannot necessarily know when these patterns will occur; a particular run of symbols may, however, indicate which of the characteristic patterns will appear next.  Each symbol tells us a little about the near future but nothing about the infinite future, which we only learn about as time goes on; there is a continual building of prediction.
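+For a first-order Markov chain the predictive information rate $I(X_t; X_{>t} \,|\, X_{<t})$ reduces to a difference of two conditional entropies: the uncertainty about $X_{t+1}$ given only $X_{t-1}$ (the entropy rate of the two-step transition matrix $a^2$) minus the uncertainty given $X_t$ (again, the notation here is ours):
+\begin{equation}
+b = H(X_{t+1}\,|\,X_{t-1}) - H(X_{t+1}\,|\,X_{t}) = h(a^2) - h(a) ,
+\end{equation}
+where $h(\cdot)$ denotes the entropy-rate expression above evaluated for the given transition matrix.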
 
-\subsection{Transition Matrixes}
+
+
 \begin{figure}
 \centering
 \includegraphics[width=0.3\textwidth]{PeriodicMatrix.png}
 \includegraphics[width=0.3\textwidth]{NonDeterministicMatrix_bw.png}
-\caption{Two transition matrixes.  The colour represents the probabilities of transition from one state to the next (black=0, white=1). The current symbol is along the bottom, and in this case there are twelve possibilities (mapped to a chromatic scale).  The upper matrix has no uncertainty; it represents a periodic pattern. We can see for example that symbol 4 must follow symbol 3, and that symbol 10 must follow symbol 4, and so on.The lower matrix contains unpredictability but nonetheless is not completely without perceivable structure, it is of a higher entropy.   In the upper matrix,  \label{TransitionMatrixes}}
+\caption{Two transition matrixes.  The colour represents the probability of transition from one state to the next (black=0, white=1). The current symbol is along the bottom, and in this case there are twelve possibilities (mapped to a chromatic scale).  The upper matrix has no uncertainty; it represents a periodic pattern. We can see, for example, that symbol 4 must follow symbol 3, and that symbol 10 must follow symbol 4, and so on. The lower matrix contains unpredictability but is nonetheless not completely without perceivable structure; it has a higher entropy rate. \label{TransitionMatrixes}}
 \end{figure}
 
 
@@ -68,17 +69,21 @@
 \begin{figure}
 \centering
 \includegraphics[width=0.5\textwidth]{MatrixDistribution.png}
-\caption{The population of transition matrixes distributed along three axes of redundancy, entropy rate and predictive -information rate.  Note how the distribution makes triangle-like plane floating in 3d space.\label{InfoDynEngine}}
+\caption{The population of transition matrixes distributed along three axes of redundancy, entropy rate and predictive information rate.  Note how the distribution forms a triangle-like plane floating in 3d space. [todo get higher res screen shot, rotate so we can see triangle better] \label{InfoDynEngine}}
 \end{figure}
-  
-Hundreds of transition matrixes are randomly generated, and they are then placed in a 3d statistical space based on the information measures calculated from the matrix, these are redundancy, entropy rate, and predictive-information rate [see [cite]].  In fig.\ref{InfoDynEngine} on the right, we see a representation of these matrixes distributed; each one of these points corresponds to a transition matrix.  Entropy rate is the average uncertainty for the next symbol as we go through the sequence.  A looping sequence has 0 entropy, a sequence that is difficult to predict has high entropy rate.   Entropy rate is an average of `surprisingness' over time.  
-
-Redundancy tells us the difference in uncertainty before we look at the context (the fixed point distribution) and the uncertainty after we look at context.  For instance a matrix with high redundancy, such as one that represents a long periodic sequence, would have high uncertainty before we look at the context but as soon as we look at the previous symbol, the uncertainty drops to zero because we now know what is coming next.
-
-Predictive information rate tell us the average reduction in uncertainty upon perceiving a symbol; a system with high predictive information rate means that each symbol tells you more about the next one.  If we imagine a purely periodic sequence, each symbol tells you nothing about the next one that we didn't already know as we already know how the pattern is going.  Similarly with a seemingly uncorrelated sequence,  seeing the next symbol does not tell us anymore because they are completely independent anyway; there is no pattern.   There is a subset of transition matrixes that have high predictive information rate, and it is neither the periodic ones, nor the completely un-corellated ones.  Rather they tend to yield output that have certain characteristic patterns, however a listener can't necessarily know when they occur.  However a certain sequence of symbols might tell us about which one of the characteristics patterns will show up next.  Each symbols tell a us little bit about the future but nothing about the infinite future, we only learn about that as time goes on; there is continual building of prediction.
-
-When we look at the distribution of matrixes generated by a random sampling method in this 3d space of entropy rate, redundancy and predictive information rate, it forms an arch shape that is fairly thin, and it thus becomes a reasonable approximation to pretend that it is just a sheet in two dimensions(see fig.\ref{InfoDynEngine}).  It is this triangular sheetfig.\ref{TheTriangle} that is then mapped either to the screen, or in the case of the interactive installation, physical space.  Each corner corresponding to three different extremes of predictability/unpredictability, which could be loosely characterised as periodicity, noise and repetition.  
-
-\begin{figure}
+  
+\subsection{Making the triangle}
+Hundreds of transition matrixes are generated by a random sampling method.  These are then plotted in a 3d statistical space of redundancy, entropy rate and predictive information rate.  In figure \ref{InfoDynEngine} we see a representation of how these matrixes are distributed; each point corresponds to a transition matrix.  
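+A minimal sketch of this sampling-and-measurement step (the Dirichlet-distributed rows, the alphabet size of twelve and the population size are assumptions for illustration, not the system's actual parameters):
+\begin{verbatim}
+import numpy as np
+
+def xlogx(p):
+    # elementwise p * log2(p), with the convention 0 * log(0) = 0
+    out = np.zeros_like(p)
+    np.log2(p, out=out, where=p > 0)
+    return p * out
+
+def measures(A):
+    # stationary distribution: left eigenvector of A for eigenvalue 1
+    vals, vecs = np.linalg.eig(A.T)
+    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
+    pi = pi / pi.sum()
+    h = float(-np.sum(pi[:, None] * xlogx(A)))            # entropy rate
+    redundancy = float(-np.sum(xlogx(pi))) - h
+    pir = float(-np.sum(pi[:, None] * xlogx(A @ A))) - h  # predictive information rate
+    return h, redundancy, pir
+
+# a population of random transition matrixes, one point each in the 3d space
+rng = np.random.default_rng(0)
+population = [rng.dirichlet(0.1 * np.ones(12), size=12) for _ in range(500)]
+points = np.array([measures(A) for A in population])
+\end{verbatim}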
+  
+  \begin{figure}[h]
 \centering
 \includegraphics[width=0.5\textwidth]{TheTriangle.pdf}
-\caption{The Melody Triangle \label{TheTriangle}}
+\caption{The Melody Triangle [todo fix shading to be more triangular]  \label{TheTriangle}}
 \end{figure}
 
+  
+When we look at the distribution of matrixes generated by this random sampling method, we see that it forms an arch shape that is fairly thin, and it thus becomes a reasonable approximation to treat it as a two-dimensional sheet.  This triangular sheet is our 'Melody Triangle'.  It is the interface that is mapped either to the screen or, in the case of the interactive installation, to physical space.  Each corner corresponds to a different extreme of predictability/unpredictability, which can be loosely characterised as periodicity, noise and repetition. [expand, make better] 
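+One simple way to realise this 'backwards' mapping (the draft does not specify the exact projection of the sheet onto the interface, so this nearest-neighbour lookup is only an illustrative assumption) is to store a 2-D coordinate for every matrix in the population and return the matrix closest to the user's position:
+\begin{verbatim}
+import numpy as np
+
+def nearest_matrix(position, coords, population):
+    # position:   (x, y) from the screen or tracked floor location
+    # coords:     (N, 2) array, one 2-D coordinate per stored matrix
+    # population: list of N transition matrixes
+    d = np.linalg.norm(coords - np.asarray(position), axis=1)
+    return population[int(np.argmin(d))]
+\end{verbatim}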
+
+
+
 
 
 
@@ -87,7 +92,7 @@
 
 \subsection{The Multi-User Installation}
 
-
+[old text - rewrite!]
 
 the statistical properties of this melody are based on where in the physical room the participant is standing, as this position is mapped to a statistical space (see below).  By exploring the physical space, the participants thus explore the predictability of the melodic and rhythmic patterns, based on a simple model of how one might guess the next musical event given the previous one.  
 \dots
@@ -122,7 +127,7 @@
 
 
 \subsection{The Screen Based Interface}
-
+[todo]
 [screen shot]
 On the screen is a triangle and a round token.
 
@@ -134,6 +139,7 @@
 
 
 \section{Musical Preference and Information Dynamics Study}
+[TODO!]
 The study was divided into five subtasks.  The axes of the triangle were randomly rearranged prior to each participant's session. 
 
 In the first task, which lasted [4/3] minutes, we simply asked the participant to move the token wherever in the triangle they wished.  This allowed the participant to get used to the environment and the interface, and to get a sense of how the position of the token changes the melody.