comparison nime2012/mtriange.tex @ 6:99b7caedab90

added some kinnect text & images
author Henrik Ekeus <hekeus@eecs.qmul.ac.uk>
date Thu, 02 Feb 2012 17:47:14 +0000
parents d5702ab76262
children 9b14f9db3c30
\dots

When multiple people are in the space, they can cooperate to create polyphonic musical textures. For example, one person could lay down a predictable repeating bass line by staying on the periodicity/repetition side of the room, while a companion generates a freer melodic line by being nearer the `noise' part of the space.

A Kinect camera was used to track individuals in the space. An application developed in OpenFrameworks sent each individual's position and bounding-box values (for gesture recognition) to an application running the Information Dynamics Engine (Matlab/Prolog). This mapped the input to a series of symbols that were sent over OSC to Max/MSP, where they were interpreted as MIDI, mapped to channels, and passed on to Logic for output using software instruments.
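The positions travel between processes as OSC messages. The address pattern used by the installation is not documented here, so the `/visitor/1/pos` address below is purely illustrative; the byte packing itself follows the OSC 1.0 encoding (null-terminated strings padded to 4-byte boundaries, big-endian float32 arguments). A minimal sketch:

```python
import struct

def osc_message(address: str, *floats: float) -> bytes:
    """Pack an OSC 1.0 message whose arguments are all float32.

    Per the OSC 1.0 spec: the address and type-tag strings are
    null-terminated and zero-padded to 4-byte boundaries, and floats
    are big-endian IEEE 754 single precision.
    """
    def pad(b: bytes) -> bytes:
        # always at least one NUL, total length a multiple of 4
        return b + b"\x00" * (4 - len(b) % 4)

    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# e.g. one visitor's floor position, normalised to [0, 1]
packet = osc_message("/visitor/1/pos", 0.25, 0.75)
```

The resulting bytes would then be handed to a UDP socket aimed at the machine running Max/MSP.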
\subsubsection{Tracking and Control}
The tracking uses the OpenNI library's API and high-level middleware with the Kinect. This provided reliable blob tracking of humanoid forms in 2D space. By triangulating this against the Kinect's depth map it became possible to get reliable coordinates for visitors' positions in the space.

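The triangulation against the depth map amounts to a standard pinhole back-projection of the blob centroid. The intrinsics below are nominal Kinect-style values, not the installation's actual calibration, and the function name is ours:

```python
def pixel_to_floor(u, v, depth_mm, fx=594.0, fy=591.0, cx=320.0, cy=240.0):
    """Back-project a depth pixel to camera-space metres (pinhole model).

    (u, v) is the blob centroid in the 640x480 depth image and depth_mm
    the depth reading there.  fx, fy, cx, cy are nominal intrinsics; a
    real deployment would use calibrated values.
    """
    z = depth_mm / 1000.0
    x = (u - cx) * z / fx   # lateral offset in metres
    y = (v - cy) * z / fy   # vertical offset in metres
    return x, y, z

# the bird's-eye position used for the statistical mapping is (x, z)
x, y, z = pixel_to_floor(320, 240, 2000)   # -> (0.0, 0.0, 2.0)
```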
\begin{figure*}[t]
\centering
\includegraphics[width=1\textwidth]{kinnect.pdf}
\caption{On the left, the depth map as seen by the Kinect; the bounding boxes outline the blobs detected by OpenNI. On the right is a bird's-eye view of the positions of the individuals tracked in the space. These are the positions that are mapped to the statistical space of the information dynamics engine.\label{Kinnect}}
\end{figure*}
This system was then further extended to detect gestures. By taking the bounding box of each individual's 2D blob and normalising it by the distance given in the depth map, it became possible to work out whether an individual had an arm stretched out or was crouching.

With this it was possible to define a series of gestures for controlling the system without the use of any hand-held controllers. For instance, sticking out one's left arm quickly doubles the tempo of the melody; pulling the left arm in while sticking the right arm out shifts the melody onto the offbeat; extending both arms changes the instrument.
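The paper does not give the actual thresholds, so the following is only a sketch of the idea: scale the bounding box by depth so that apparent size becomes distance-invariant, then threshold width and height against nominal body dimensions. The focal length, body sizes, and thresholds here are all assumed values:

```python
def classify_pose(box_w, box_h, depth_m, shoulder_w=0.45, stand_h=1.7):
    """Coarsely classify a pose from a tracked blob's bounding box.

    box_w/box_h are in pixels; multiplying by depth (and dividing by an
    assumed focal length of 525 px) normalises for distance, since
    apparent size falls off as 1/depth.  Thresholds are illustrative.
    """
    w = box_w * depth_m / 525.0   # approximate metres at this depth
    h = box_h * depth_m / 525.0
    if h < 0.75 * stand_h:
        return "crouching"
    if w > 1.6 * shoulder_w:
        return "both-arms-out"
    if w > 1.15 * shoulder_w:
        return "one-arm-out"
    return "neutral"

pose = classify_pose(box_w=236, box_h=446, depth_m=2.0)  # -> "both-arms-out"
```

Distinguishing the left arm from the right would additionally compare the blob centroid's horizontal offset within the box; that refinement is omitted here.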
\subsection{The Screen Based Interface}

[screen shot]
On the screen is a triangle and a round, red token. With the mouse, the user can click and drag the token and move it around the screen. When the token is dragged into the triangle, the system starts generating a sequence of piano notes; the pattern of notes depends on where in the triangle the token is.
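The text does not spell out how a screen position inside the triangle becomes statistical parameters; one natural sketch is barycentric coordinates, which yield three weights (one per triangle vertex) that sum to one and double as the inside-the-triangle hit test. This is our illustration, not necessarily the installation's mapping:

```python
def barycentric(p, a, b, c):
    """Barycentric weights of point p in triangle (a, b, c).

    The weights sum to 1, and all three are non-negative exactly when
    p lies inside the triangle -- which also serves as the hit test
    for whether the token has been dragged into the triangle.
    """
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det
    return wa, wb, 1.0 - wa - wb

# token at (0.5, 0.3) in a unit-ish triangle: all weights non-negative
inside = all(w >= 0 for w in barycentric((0.5, 0.3),
                                         (0.0, 0.0), (1.0, 0.0), (0.5, 1.0)))
```

Each weight could then scale the statistical property associated with its vertex (e.g. periodicity versus noise) before the values are sent on to the generator.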
\section{Musical Preference and Information Dynamics Study}

The study was divided into five subtasks. The axes of the triangle were randomly rearranged prior to each participant's session.

In the first task, which lasts [4/3] minutes, participants are simply asked to move the token wherever in the triangle they wish. This allowed the participant to get used to the environment and the interface, and to get a sense of how the position of the token changes the melody.

In the following tasks a background melody is playing and the participants are asked to find a second melody that `works well' with the background melody. In each of these tasks the background melody has different statistical properties. In the first it ....., in the second the background melody ... in the third... And finally in the fourth case the melody is in the middle of the triangle, that is it....