Wiki » History » Version 3

Version 3/5 (Becky Stewart, 2011-03-04 06:07 PM)

h1. Project Summary

This project was first developed for the Royal Academy of Engineering exhibit at the Big Bang Fair 2011. The aim was to create an engaging interface that generates music with the Kinect in order to encourage students to study music technology.

The Kinect camera can track up to 3 people within a space of approximately 3 m x 3 m. The positions of the people relative to the camera affect the generated music, with each person controlling a different instrument.
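Under these assumptions (each person's lateral position within the 3 m space selects their instrument), the mapping can be sketched as follows. The equal-zone split and the function name are illustrative assumptions, not the exhibit's actual code:

```python
def assign_instrument(x, space_width=3.0, n_instruments=3):
    """Map a tracked x position (metres from the left edge of the
    camera's view) to an instrument index by splitting the space into
    equal zones. The equal-zone mapping is an assumption for
    illustration; the real mapping is inside the Processing sketch."""
    zone = int(x / space_width * n_instruments)
    return max(0, min(n_instruments - 1, zone))  # clamp to a valid index
```

A person standing half a metre from the edge would control instrument 0; someone near the far edge, instrument 2.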

h1. System Architecture

h2. User Tracking

User tracking uses the Kinect camera via the OpenNI openFrameworks extension.

h2. Metronome

The metronome is implemented in Max/MSP.
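The Max/MSP patch itself is not reproduced on this page, but the timing arithmetic a metronome needs is simple (60 / bpm seconds per beat). This Python sketch is only an illustration of that arithmetic, not the patch:

```python
def beat_times(bpm, n_beats):
    """Timestamps (in seconds) of the first n_beats ticks of a
    metronome running at bpm beats per minute. Illustrative only;
    the actual metronome is a Max/MSP patch."""
    interval = 60.0 / bpm  # seconds per beat
    return [i * interval for i in range(n_beats)]
```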

h2. Chord and Note Generator

The chord and note generator is a Processing sketch.
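The sketch's exact chord logic is not documented on this page. As a hedged illustration of one common approach, diatonic triads can be built over a major scale from MIDI note numbers (this Python sketch is an assumption, not the Processing code):

```python
MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def triad(degree, root=60):
    """MIDI notes of the triad on a 0-based scale degree of the major
    scale starting at `root` (60 = middle C). Illustrative sketch only."""
    notes = []
    for step in (0, 2, 4):  # root, third and fifth of the chord
        octave, pos = divmod(degree + step, 7)
        notes.append(root + MAJOR_SCALE[pos] + 12 * octave)
    return notes
```

For example, degree 0 yields a C major triad and degree 4 the G major (dominant) triad.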

h2. Synthesiser

The synthesiser is Ableton Live with Max for Live.

!movesynth.png!

h1. How to Set It Up

First, make sure you have the most recent code from the repository.

h2. Set Up Network

# On a Mac, click on the wireless icon next to the clock and select "Create Network..." from the drop-down menu.
# Choose a name and password for the network, using Automatic channel selection. Five-letter network names and passwords have proven less error-prone when pushing around OSC data, though this may mostly be superstition.
# Open System Preferences, go to the Network pane, click "Advanced..." and select the TCP/IP tab. Note the IP address. Repeat this process for every other computer on the network.
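The data pushed between the machines is OSC. As a rough illustration of what travels over the ad-hoc network, this Python sketch hand-encodes a minimal OSC 1.0 message; the @/pos@ address and float arguments are hypothetical, not the system's actual address space:

```python
import struct

def osc_message(address, *floats):
    """Encode a minimal OSC 1.0 message (address plus float32
    arguments). Strings are NUL-terminated and padded to 4-byte
    boundaries; numbers are big-endian, per the OSC 1.0 spec.
    The address used here is hypothetical."""
    def pad(b):
        # Append the NUL terminator plus padding to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)
    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))  # type tags
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32
    return msg
```

In practice a library (e.g. oscP5 in Processing, or Max's @udpsend@) does this encoding; the sketch only shows the wire format being moved around.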

h2. Start Chord Generator

# Open HeresyBigBangDone.pde in Processing (be sure to use version 1.2.1 or later).
# In the Header tab of the sketch, set the NetAddress ableton variable to the IP address of the computer running Ableton.

h2. Start Metronome
h2. Start Camera Tracking
h2. Start Synthesiser