draft.tex @ 17:e47aaea2ac28: Added images
author: Henrik Ekeus <hekeus@eecs.qmul.ac.uk>
date: Wed, 07 Mar 2012 15:12:49 +0000
parents: d5f63ea0f266
children: ca694f7dc3f9
The measures are the entropy rate, redundancy and predictive information rate of the random process used to generate the sequence of notes.
These are all related to the predictability of the sequence, and as such address the notions of expectation and surprise in the perception of music.\emph{self-plagiarised}

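For a first-order Markov chain the first two of these measures have standard closed forms. As a sketch (assuming a transition matrix $a$ with stationary distribution $\pi$ over an alphabet of $N$ symbols, and taking redundancy relative to the maximum entropy $\log N$; the exact definitions used elsewhere in this paper may differ):

```latex
% Entropy rate of a first-order Markov chain, where
% a_{ij} = P(next symbol = j | current symbol = i) and
% \pi is the stationary distribution (\pi = \pi a):
\begin{equation}
  h = -\sum_{i} \pi_i \sum_{j} a_{ij} \log a_{ij}
\end{equation}
% Redundancy, here taken relative to the maximum possible
% entropy \log N over an alphabet of N symbols:
\begin{equation}
  \rho = \log N - h
\end{equation}
```

The predictive information rate is more involved: informally, it is the information each new symbol carries about the future of the sequence given its past, and it is low both for totally random and for fully deterministic processes.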
Before the Melody Triangle can be used, it has to be `populated' with possible parameter values for the melody generators.
These are then plotted in a 3-d statistical space of redundancy, entropy rate and predictive information rate.
In our case we generated thousands of transition matrices, representing first-order Markov chains, by a random sampling method. In figure \ref{InfoDynEngine} we see a representation of how these matrices are distributed in the 3-d statistical space; each of these points corresponds to a transition matrix.\emph{self-plagiarised}
244 | |
245 \begin{figure} | |
246 \centering | |
247 \includegraphics[width=0.5\textwidth]{MatrixDistribution.png} | |
248 \caption{The population of transition matrixes distributed along three axes of redundancy, entropy rate and predictive information rate. Note how the distribution makes a curved triangle-like plane floating in 3d space. \label{InfoDynEngine}} | |
249 \end{figure} | |
250 | |
244 | 251 |
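The sampling and plotting step described above can be sketched as follows. This is a minimal illustration only: it assumes each row of a matrix is drawn from a Dirichlet distribution with a randomised concentration (so the population spans near-deterministic to near-uniform behaviour), and it computes only entropy rate and a simple redundancy; the sampling method actually used, and the predictive information rate computation, are not reproduced here.

```python
import numpy as np

def stationary_distribution(a):
    """Stationary distribution pi of transition matrix a (pi = pi @ a)."""
    vals, vecs = np.linalg.eig(a.T)
    pi = np.real(vecs[:, np.argmax(np.real(vals))])
    return pi / pi.sum()

def entropy_rate(a):
    """Entropy rate h = -sum_i pi_i sum_j a_ij log2 a_ij, in bits."""
    pi = stationary_distribution(a)
    terms = np.where(a > 0, a * np.log2(np.where(a > 0, a, 1.0)), 0.0)
    return float(-(pi @ terms.sum(axis=1)))

def sample_matrices(n_matrices, n_states, rng):
    """Draw random first-order Markov transition matrices, one
    Dirichlet-distributed row at a time; a randomised concentration
    parameter varies how 'peaked' (deterministic) the rows are."""
    out = []
    for _ in range(n_matrices):
        alpha = rng.uniform(0.05, 5.0)
        out.append(rng.dirichlet([alpha] * n_states, size=n_states))
    return out

rng = np.random.default_rng(0)
mats = sample_matrices(1000, 8, rng)
# Two of the three coordinates in the statistical space: entropy
# rate and redundancy (here log2 N - h); each point is one matrix.
coords = [(entropy_rate(a), np.log2(8) - entropy_rate(a)) for a in mats]
```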
When we look at the distribution of transition matrices plotted in this space, we see that it forms a fairly thin arch shape.
It is thus a reasonable approximation to treat it as a two-dimensional sheet, and so we stretch this curved arc out into a flat triangle.
It is this triangular sheet that is our `Melody Triangle', and it forms the interface by which the system is controlled. \emph{self-plagiarised}

When the Melody Triangle is used, whether as a screen-based system or as an interactive installation, it involves a mapping to this statistical space.
When the user, through the interface, selects a position within the triangle, the corresponding transition matrix is returned.
Figure \ref{TheTriangle} shows how the triangle maps to different measures of redundancy, entropy rate and predictive information rate.\emph{self-plagiarised}
\begin{figure}
\centering
\includegraphics[width=0.5\textwidth]{TheTriangle.pdf}
\caption{The Melody Triangle\label{TheTriangle}}
\end{figure}
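The position-to-matrix mapping can be sketched as a nearest-neighbour lookup over the precomputed population. Everything here is hypothetical scaffolding: the class name, the assumption that each stored matrix has already been assigned a 2-d position on the flattened triangle, and the two toy corner matrices are illustrative only, not the system's actual implementation.

```python
import numpy as np

class MelodyTriangleLookup:
    """Map a touched 2-d triangle position to the nearest stored
    transition matrix (hypothetical sketch of the lookup step)."""

    def __init__(self, positions, matrices):
        self.positions = np.asarray(positions, dtype=float)  # (n, 2)
        self.matrices = matrices                             # n matrices

    def select(self, x, y):
        """Return the transition matrix nearest the selected point."""
        d2 = ((self.positions - np.array([x, y])) ** 2).sum(axis=1)
        return self.matrices[int(np.argmin(d2))]

# Toy population: two matrices placed at opposite corners.
repetition = np.eye(2)             # deterministic: one repeating note
noise = np.full((2, 2), 0.5)       # uniformly random transitions
lookup = MelodyTriangleLookup([(0.0, 0.0), (1.0, 0.0)],
                              [repetition, noise])
chosen = lookup.select(0.9, 0.1)   # a point near the 'noise' corner
```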
Each corner corresponds to one of three extremes of predictability and unpredictability, which could be loosely characterised as `periodicity', `noise' and `repetition'.
Melodies from the `noise' corner have no discernible pattern; they have high entropy rate, low predictive information rate and low redundancy.
These melodies are essentially totally random.
Melodies along the `periodicity' to `repetition' edge are all deterministic loops that get shorter as we approach the `repetition' corner, until each becomes just one repeating note.
It is the areas in between these extremes that provide the more `interesting' melodies.