Abstract: The wealth of tools developed in music information retrieval (MIR) for the description, indexing, and retrieval of music and sound can easily be (ab)used for the creation of new musical material and sound design. Based on automated audio description and selection, corpus-based concatenative synthesis makes it possible to exploit large collections of sound to compose novel timbral and harmonic structures. The metaphor for musical creation here is an explorative navigation through the sonic landscape of the corpus. We will present examples and applications of real-time interactive corpus-based concatenative synthesis for music composition, sound design, installations, and interactive performance.
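The core selection step behind concatenative synthesis can be illustrated with a minimal sketch: each sound unit in the corpus is represented by a vector of audio descriptors, and synthesis picks the unit nearest to a target point in that descriptor space. The descriptors, values, and function name below are illustrative assumptions, not the actual system described in the talk.

```python
import numpy as np

def select_unit(corpus_descriptors, target):
    """Return the index of the corpus unit closest to the target
    in descriptor space (Euclidean distance)."""
    distances = np.linalg.norm(corpus_descriptors - target, axis=1)
    return int(np.argmin(distances))

# Hypothetical toy corpus: one row per sound unit,
# columns = (spectral centroid in Hz, loudness in dB)
corpus = np.array([
    [440.0, -20.0],
    [880.0, -12.0],
    [1760.0, -6.0],
])

# Target descriptor point the user is "navigating" towards
target = np.array([900.0, -10.0])
print(select_unit(corpus, target))  # → 1 (the 880 Hz unit is closest)
```

In a real system the descriptors would be normalized and weighted, and selection would also account for continuity between consecutive units, but the nearest-neighbour lookup above captures the basic navigation metaphor.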
Bio: Diemo Schwarz is a researcher and developer in the Real-Time Music Interaction (IMTR) team at Ircam, working on sound analysis and interactive corpus-based concatenative synthesis in multiple research and musical projects at the intersection of computer science, music technology, and audio-visual creation. He holds a PhD in computer science applied to music from the University of Paris, awarded in 2004 for the development of a new method of concatenative musical sound synthesis based on unit selection from a large database.