diff nime2012/mtriange.tex @ 18:37b3777c60c0
Reformatted to follow NIME template
author   | Henrik Ekeus <hekeus@eecs.qmul.ac.uk>
---------|---------------------------------------
date     | Sat, 04 Feb 2012 23:34:58 +0000
parents  | 664c3852fca8
children | 0c0c62685abb
--- a/nime2012/mtriange.tex	Sat Feb 04 20:48:22 2012 +0000
+++ b/nime2012/mtriange.tex	Sat Feb 04 23:34:58 2012 +0000
@@ -11,8 +11,8 @@
 \CopyrightYear{2012} %will cause 2008 to appear in the copyright line.
 \crdata{Copyright remains with the author(s).}
-\conferenceinfo{NIME12,}{ Somewhere}
-
+\conferenceinfo{NIME'12,}{May 21 -- 23, 2012, University of Michigan,
+  Ann Arbor.}
 %TODO
 %
@@ -28,13 +28,36 @@
 %
 %
 \title{The Melody Triangle - Pattern and Predictability in Music}
-\numberofauthors{2}
+\numberofauthors{4}
 \author{
- \alignauthor Henrik Ekeus (1), Samer Abdallah (1), Mark D. Plumbley, Peter W. McOwan\\
- \affaddr{(1) Centre for Digital Music}\\
- \affaddr{Queen Mary University of London}
+\alignauthor
+Henrik Ekeus\\
+ \affaddr{Queen Mary University of London}\\
+ \affaddr{Media and Arts Technology}\\
+ \affaddr{School of Electronic Engineering and Computer Science}\\
+ \email{hekeus@eecs.qmul.ac.uk}
+\alignauthor
+Samer Abdallah\\
+ \affaddr{Queen Mary University of London}\\
+ \affaddr{Center for Digital Music}\\
+ \affaddr{School of Electronic Engineering and Computer Science}\\
+ \email{samer.abdallah@\\eecs.qmul.ac.uk}
+\and
+\alignauthor
+Mark D. Plumbley\\
+ \affaddr{Queen Mary University of London}\\
+ \affaddr{Center for Digital Music}\\
+ \affaddr{School of Electronic Engineering and Computer Science}\\
+ \email{mark.plumbley@\\eecs.qmul.ac.uk}
+% 3rd. author
+\alignauthor
+Peter W. McOwan\\
+ \affaddr{Queen Mary University of London}\\
+ \affaddr{Computer Vision Group}\\
+ \affaddr{School of Electronic Engineering and Computer Science}\\
+ \email{Peter.McOwan@\\eecs.qmul.ac.uk}
 }
-
+\date{7 February 2012}
 \begin{document}
 \maketitle
 \begin{abstract}
@@ -64,7 +87,7 @@
 Redundancy tells us the difference in uncertainty before we look at the context (the fixed point distribution) and the uncertainty after we look at context. For instance a transition matrix with high redundancy, such as one that represents a long periodic sequence, would have high uncertainty before we look at the context but as soon as we look at the previous symbol, the uncertainty drops to zero because we now know what is coming next.

 \subsubsection{Entropy rate}
 [todo - a more formal description]
-Entropy rate is the average uncertainty for the next symbol as we go through the sequence. A looping sequence has 0 entropy, a sequence that is difficult to predict has high entropy rate. Entropy rate is an average of ÔsurprisingnessÕ over time.
+Entropy rate is the average uncertainty for the next symbol as we go through the sequence. A looping sequence has 0 entropy, a sequence that is difficult to predict has high entropy rate. Entropy rate is an average of 'surprisingness' over time.

 \subsubsection{Predictive Information Rate}
 [todo - a more formal description]
@@ -128,15 +151,15 @@
 Tracking and control was done using the OpenNI libraries' API and high level middle-ware for tracking with Kinect. This provided reliable blob tracking of humanoid forms in 2d space. By triangulating this to the Kinect's depth map it became possible to get reliable coordinate of visitors positions in the space.

 This system was extended to detect gestures. By detecting the bounding box of the 2d blobs of individuals in the space, and then normalising these based on the distance of the depth map it became possible to work out if an individual had an arm stretched out or if they were crouching.
-With this it was possible to define a series of gestures for controlling the system without the use of any controllers. Thus for instance by sticking out one's left arm quickly, the melody doubles in tempo. By pulling one's left arm in at the same time as sticking the right arm out the melody would shift onto the offbeat. Sending out both arms would change instrument.
+With this it was possible to define a series of gestures for controlling the system without the use of any controllers(see table \ref{gestures}). Thus for instance by sticking out one's left arm quickly, the melody doubles in tempo. By pulling one's left arm in at the same time as sticking the right arm out the melody would shift onto the offbeat. Sending out both arms would change instrument.

-\begin{figure}
+\begin{table}
 \centering
 %\includegraphics[width=0.5\textwidth]{InstructionsText.pdf}
-\begin{tabular}{ l l l }
-
+\caption{Gestures and their resulting effect\label{gestures}}
+\begin{tabular}{ l c l }
 left arm & right arm & meaning\\
-\hline\\
+\hline
 out & static & double tempo \\
 in & static & halve tempo \\
 static & out & triple tempo \\
@@ -145,10 +168,7 @@
 out & out & change instrument\\
 in & in & reset tempo\\
 \end{tabular}
-
-
-\caption{Gestures and their resulting effect. For instance sending one's left arm out while keeping the right static would double the tempo of the melody being generated.\label{gestures}}
-\end{figure}
+\end{table}
 \subsubsection{Observations}
 Although visitors would need an initial bit of training they could then quickly be made to collaboratively design musical textures. For example, one person could lay down a predictable repeating bass line by keeping themselves to the periodicity/repetition side of the room, while a companion can generate a freer melodic line by being nearer the 'noise' part of the space.
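
For reference, the entropy rate, redundancy and predictive information rate discussed in the @@ -64,7 +87,7 @@ hunk above have standard closed forms for a first-order Markov chain. The LaTeX sketch below assumes a transition matrix with entries $a_{ij} = P(X_{t+1}=j \mid X_t=i)$ and stationary (fixed point) distribution $\pi$; the notation is illustrative and need not match the paper's eventual formal section.

\begin{align}
  h    &= -\sum_i \pi_i \sum_j a_{ij} \log a_{ij}
          && \text{entropy rate: mean uncertainty about the next symbol}\\
  \rho &= -\sum_i \pi_i \log \pi_i \;-\; h
          && \text{redundancy: drop in uncertainty once the context is seen}\\
  b    &= H(X_t \mid X_{<t}) - H(X_t \mid X_{<t}, X_{>t})
          && \text{predictive information rate}
\end{align}

On these definitions both extremes in the text give $b = 0$: a deterministic loop has $h = 0$, while an unpredictable (i.i.d.) source gains nothing about $X_t$ from either past or future context.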
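
The tracking hunk (@@ -128,15 +151,15 @@) describes deciding whether an arm is stretched out by normalising a tracked blob's 2-D bounding box against the Kinect depth map. A minimal Python sketch of that heuristic follows; the Blob fields, the 1.3 threshold and the function names are assumptions for illustration, not the installation's actual OpenNI code.

# Illustrative sketch of the described heuristic: scale a person's bounding-box
# width by their depth so the measure is distance-invariant, then treat a box
# that is much wider than the calibrated at-rest width as "arm out".
from dataclasses import dataclass

@dataclass
class Blob:
    x: float            # bounding-box left edge, image pixels
    width: float        # bounding-box width, image pixels
    centroid_x: float   # horizontal centre of mass of the blob, image pixels
    depth_mm: float     # distance of the blob from the Kinect, millimetres

def normalised_width(blob: Blob) -> float:
    """Apparent width scaled by depth, so a person measures roughly the same
    whether they stand near the sensor or at the back of the space."""
    return blob.width * blob.depth_mm / 1000.0

def arm_state(blob: Blob, rest_width: float, factor: float = 1.3) -> tuple:
    """Return (left_side, right_side) in image space, each 'out' or 'in'.

    rest_width is the person's normalised width with both arms down,
    e.g. captured during a short calibration step when they enter."""
    if normalised_width(blob) < rest_width * factor:
        return ("in", "in")
    # An outstretched arm widens the box on one side only, so the centre of
    # mass ends up offset from the geometric centre of the box.
    box_centre = blob.x + blob.width / 2.0
    if blob.centroid_x > box_centre:
        return ("out", "in")   # box grew towards the left image edge
    return ("in", "out")       # box grew towards the right image edge

# Example: a wide box whose mass sits right of centre reads as left arm out.
print(arm_state(Blob(x=180.0, width=160.0, centroid_x=280.0, depth_mm=2500.0),
                rest_width=250.0))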