Fiore Martin, 2016-08-11 12:04 PM
Collidoscope is an interactive, collaborative *sound installation* and *musical instrument* (granular synthesizer)
that allows participants to seamlessly record, manipulate, explore and perform real-world sounds.
It is designed for both *amateurs and professional musicians*.
Collidoscope draws heavily on the *research* conducted at the "Centre for Digital Music at Queen Mary University of London":http://c4dm.eecs.qmul.ac.uk
in real-time sound processing and synthesis, hardware design, physical computing, interface and interaction design,
accessibility and collaboration.
The *granular synthesis engine* is based on Ross Bencina's paper "Implementing Real-Time Granular Synthesis":http://www.cs.au.dk/~dsound/DigitalAudio.dir/Papers/BencinaAudioAnecdotes310801.pdf, on the
"TGrains Unit Generator":http://doc.sccode.org/Classes/TGrains.html of the SuperCollider language and on the "Cross-modal DAW Prototype":https://code.soundsoftware.ac.uk/projects/cmdp developed in the EPSRC
funded project Design Patterns for Inclusive Collaboration ("DePIC":http://depic.eecs.qmul.ac.uk).
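To illustrate the basic idea behind the engine (this is a hypothetical offline sketch, not Collidoscope's actual implementation): granular synthesis reads many short, windowed slices ("grains") from a recorded buffer and overlap-adds them into the output, typically with some randomisation of the read position. All names and parameters below are illustrative.

```python
import numpy as np

def granulate(source, grain_len=512, hop=128, pos_jitter=64, seed=0):
    """Resynthesise `source` from overlapping Hann-windowed grains.

    Each grain is read from a slightly randomised position in the
    source buffer, which produces the characteristic granular
    "smearing" of the original sound.
    """
    rng = np.random.default_rng(seed)
    window = np.hanning(grain_len)
    # Extra tail so the last grain can be written without bounds checks.
    out = np.zeros(len(source) + grain_len)
    for start in range(0, len(source) - grain_len, hop):
        # Jitter the read position, then clamp it inside the buffer.
        read = start + rng.integers(-pos_jitter, pos_jitter + 1)
        read = int(np.clip(read, 0, len(source) - grain_len))
        grain = source[read:read + grain_len] * window
        out[start:start + grain_len] += grain
    return out[:len(source)]

if __name__ == "__main__":
    sr = 44100
    t = np.arange(sr) / sr
    tone = np.sin(2 * np.pi * 440 * t)  # 1-second test tone
    result = granulate(tone)
    print(result.shape)
```

A real-time engine such as the one Bencina describes schedules grains incrementally per audio block rather than in one offline pass, but the window-multiply-and-accumulate core is the same.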
Collidoscope was funded by the EPSRC Centre for Digital Music (C4DM) Platform Grant (EP/K009559/1) and the
EPSRC Design Patterns for Inclusive Collaboration (DePIC) grant (EP/J017205/1).