changeset 6:99b7caedab90

added some kinnect text & images
author Henrik Ekeus <hekeus@eecs.qmul.ac.uk>
date Thu, 02 Feb 2012 17:47:14 +0000
parents d5702ab76262
children 9b14f9db3c30
files nime2012/InstructionsImage.pdf nime2012/kinnect.pdf nime2012/mtriange.pdf nime2012/mtriange.tex
diffstat 4 files changed, 12 insertions(+), 11 deletions(-)
Binary file nime2012/InstructionsImage.pdf has changed
Binary file nime2012/kinnect.pdf has changed
Binary file nime2012/mtriange.pdf has changed
--- a/nime2012/mtriange.tex	Thu Feb 02 17:34:19 2012 +0000
+++ b/nime2012/mtriange.tex	Thu Feb 02 17:47:14 2012 +0000
@@ -73,8 +73,19 @@
 
 When multiple people are in the space, they can cooperate to create polyphonic musical textures.  For example, one person could lay down a predictable repeating bass line by staying on the periodicity/repetition side of the room, while a companion generates a freer melodic line by standing nearer the 'noise' part of the space.
 
+A Kinect camera was used to track individuals in the space.  An application developed in openFrameworks sent each individual's position and bounding-box values (for gesture recognition) to an application running the Information Dynamics Engine [Matlab/Prolog].  This mapped the input to a series of symbols that were sent over OSC to Max/MSP, where they were interpreted as MIDI, mapped to channels, and passed on to Logic for output using software instruments.
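+
+As an illustration of this pipeline, a minimal sketch of the openFrameworks sender follows; it assumes the ofxOsc addon, and the host, port and \texttt{/person} address are placeholder values rather than those used in the installation.
+
+\begin{verbatim}
+// Sketch: send one tracked person's position and bounding
+// box to the Information Dynamics Engine over OSC.
+// Host, port and the /person address are assumptions.
+#include "ofxOsc.h"
+
+ofxOscSender sender;
+
+void setupOsc() {
+    sender.setup("localhost", 9000);   // assumed host/port
+}
+
+void sendPerson(int id, float x, float y, float w, float h) {
+    ofxOscMessage m;
+    m.setAddress("/person");  // assumed OSC address
+    m.addIntArg(id);
+    m.addFloatArg(x);         // position in the space
+    m.addFloatArg(y);
+    m.addFloatArg(w);         // bounding box, used for
+    m.addFloatArg(h);         // gesture recognition
+    sender.sendMessage(m);
+}
+\end{verbatim}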
+\subsubsection{Tracking and Control}
+The system used the OpenNI library's API and high-level middleware for tracking with the Kinect.  This provided reliable blob tracking of humanoid forms in 2D space.  By triangulating these blobs against the Kinect's depth map, it became possible to obtain reliable coordinates of visitors' positions in the space.
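+
+A sketch of the underlying geometry, assuming approximate Kinect depth-camera intrinsics (OpenNI provides an equivalent projective-to-real-world conversion):
+
+\begin{verbatim}
+// Sketch: convert a blob centroid from depth-image
+// coordinates (pixels, depth in mm) to real-world
+// coordinates via the pinhole camera model.  The
+// intrinsics are approximate Kinect values, used here
+// only for illustration.
+struct WorldPoint { float x, y, z; };
+
+WorldPoint toWorld(float px, float py, float depthMm) {
+    const float fx = 525.0f, fy = 525.0f; // focal (px)
+    const float cx = 319.5f, cy = 239.5f; // centre (px)
+    WorldPoint p;
+    p.z = depthMm / 1000.0f;              // metres
+    p.x = (px - cx) * p.z / fx;
+    p.y = (py - cy) * p.z / fy;
+    return p;
+}
+\end{verbatim}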
 
+\begin{figure*}[t]
+\centering
+\includegraphics[width=1\textwidth]{kinnect.pdf}
+\caption{On the left is the depth map as seen by the Kinect; the bounding boxes outline the blobs detected by OpenNI.  On the right is a bird's-eye view of the positions of individuals tracked in the space.  These are the positions that are mapped to the statistical space of the information dynamics engine.\label{Kinnect}}
+\end{figure*}
 
+
 This system was then further extended to detect gestures.  By taking the bounding box of each individual's 2D blob and normalising it by distance using the depth map, it became possible to work out whether an individual had an arm stretched out or was crouching.
+
+With this it was possible to define a series of gestures for controlling the system without the use of any physical controllers.  For instance, quickly extending the left arm doubles the tempo of the melody; pulling the left arm in while extending the right arm shifts the melody onto the offbeat; extending both arms changes the instrument.
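+
+A sketch of how such gestures could be inferred from the depth-normalised bounding box; the thresholds and focal length below are illustrative assumptions, not the values used in the installation:
+
+\begin{verbatim}
+// Sketch: classify coarse gestures from a bounding box
+// normalised by depth so size is distance-invariant.
+enum Gesture { NONE, LEFT_ARM_OUT, RIGHT_ARM_OUT,
+               BOTH_ARMS_OUT, CROUCHING };
+
+Gesture classify(float wPx, float hPx, float blobCx,
+                 float boxCx, float depthMm) {
+    float z     = depthMm / 1000.0f;  // metres
+    float realW = wPx * z / 525.0f;   // approx. width (m)
+    float realH = hPx * z / 525.0f;   // approx. height (m)
+    if (realH < 1.2f)                 // shorter than a
+        return CROUCHING;             // standing adult
+    if (realW > 0.9f) {               // wider than torso
+        // An outstretched arm skews the blob centroid
+        // away from the box centre; a symmetric widening
+        // suggests both arms are out.
+        if (blobCx < boxCx - 10.0f) return LEFT_ARM_OUT;
+        if (blobCx > boxCx + 10.0f) return RIGHT_ARM_OUT;
+        return BOTH_ARMS_OUT;
+    }
+    return NONE;
+}
+\end{verbatim}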
 
 \subsection{The Screen Based Interface}
 
@@ -86,19 +97,9 @@
 
 
 
-\section{Information Dynamics and Musical Preference}
 
 
-
-
-
-
-
-\section{Technical Implementation (needed?)}
-
-an application developed in OpenFrameworks would  send the individuals positions and a bounding box values (for gesture recognition), to an application running the Information Dynamics Engine [Matlab/Prolog]. 
-
-\section{Experimental Procedure}
+\section{Musical Preference and Information Dynamics Study}
 The study was divided into 5 subtasks.  The axes of the triangle were randomly rearranged for each participant.
 
 ``In this first task, which will last [4/3] minutes, we simply ask you to move the token wherever in the triangle you wish.''  This allowed the participant to get used to the environment and the interface, and to get a sense of how the position of the token changes the melody.