Music audio concepts and design for VR are relatively under-explored compared to visual analogs. This paper addresses the design and evaluation of a dynamic experience of music in Web VR called VRTGO, in which a user explores three alternative versions of the same song called “VTGO (Vertigo)”. Each song version is represented visually by a spiral of blocks emanating from separately located vortices on the ground. The closer the avatar is to one of the three block spirals, the closer the audio experience is to one of the three intended versions of the song, in terms of tempo and instrumental mix. We describe the use of granular synthesis to achieve smooth tempo transitions, and present findings from a user study exploring engagement and enjoyment. The study finds that participants spend significantly longer in the dynamic version of the experience and report significantly greater engagement, though no significant differences in enjoyment are found between the dynamic and static versions. Users’ stated benefits of VR music experiences include enhanced creativity and emotional engagement, while drawbacks such as accessibility and the risk of over-immersion are also identified. The technical contributions and results of this work could be of interest to music artists and producers looking to explore the possibilities associated with rendering their creations in VR, and how these experiences enable novel ways to engage and interact with their audience.
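The abstract names granular synthesis as the mechanism behind the smooth tempo transitions, but the paper's own code is not reproduced here. The sketch below is a rough, hypothetical illustration of that technique with the Web Audio API, not the VRTGO implementation: overlapping grains are scheduled from an AudioBuffer while the read position advances at a rate set by a stretch factor, so tempo can follow the avatar's position without shifting pitch. `playGranular`, `stretchFn`, `vtgo.mp3`, and `proximity` are illustrative names, not identifiers from the paper.

```js
// Minimal granular time-stretch sketch with the Web Audio API (assumed, not the authors' code).
// Note: browsers require a user gesture before audio starts (ctx.resume()).
const ctx = new AudioContext();

async function loadBuffer(url) {
  const data = await (await fetch(url)).arrayBuffer();
  return ctx.decodeAudioData(data);
}

function playGranular(buffer, stretchFn, grainDur = 0.09, overlap = 0.5) {
  let readPos = 0;                       // position in the source buffer (seconds)
  let nextGrainTime = ctx.currentTime;   // when the next grain starts (context time)
  const hop = grainDur * (1 - overlap);  // time between grain onsets at the output

  function scheduleGrain() {
    const src = ctx.createBufferSource();
    src.buffer = buffer;

    // Short fade in/out per grain to avoid clicks.
    const env = ctx.createGain();
    env.gain.setValueAtTime(0, nextGrainTime);
    env.gain.linearRampToValueAtTime(1, nextGrainTime + grainDur * 0.5);
    env.gain.linearRampToValueAtTime(0, nextGrainTime + grainDur);

    src.connect(env).connect(ctx.destination);
    src.start(nextGrainTime, readPos, grainDur);

    // Advance the read position more slowly (or faster) than real time:
    // this changes tempo while leaving pitch untouched.
    readPos = (readPos + hop / stretchFn()) % buffer.duration;
    nextGrainTime += hop;

    // Keep roughly 100 ms of grains scheduled ahead.
    const lookahead = nextGrainTime - ctx.currentTime;
    setTimeout(scheduleGrain, Math.max(0, (lookahead - 0.1) * 1000));
  }
  scheduleGrain();
}

// Usage (hypothetical): tempo follows some proximity-based parameter.
// loadBuffer('vtgo.mp3').then(buf => playGranular(buf, () => 1.0 + proximity * 0.25));
```

A real build would also crossfade the per-version instrumental stems as the avatar moves between spirals; this sketch only covers the tempo side.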
WAC is an international conference devoted to audio technologies and applications on the web, bringing together researchers, developers, and artists to discuss academic and artistic research, development, design, evaluation, and standards around emerging web technologies related to audio. For its 10th anniversary, themed “Hacking and Making with Web Audio”, WAC invites exploration of new uses of the Web Audio API.
This paper introduces p5.spatial.js, an open source JavaScript library for creating multichannel sound works in the web browser. Designed to extend the popular creative coding environment p5.js, p5.spatial.js adds multichannel audio output…
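The p5.spatial.js API itself is not shown in this snippet. As a hedged illustration of the kind of plumbing such a library wraps, the plain Web Audio sketch below routes a test tone to a single channel of a multichannel interface; the channel index and variable names are illustrative, not taken from the library.

```js
// Assumed sketch of plain Web Audio multichannel routing (not the p5.spatial.js API).
// Note: browsers require a user gesture before audio starts (ctx.resume()).
const ctx = new AudioContext();

const outs = ctx.destination.maxChannelCount;   // e.g. 8 on a multichannel interface, 2 on stereo
ctx.destination.channelCount = outs;
ctx.destination.channelCountMode = 'explicit';
ctx.destination.channelInterpretation = 'discrete';

// A merger exposes one input per hardware output channel.
const merger = ctx.createChannelMerger(outs);
merger.connect(ctx.destination);

// Send a 440 Hz tone to output channel 3 (0-indexed), falling back on stereo hardware.
const ch = Math.min(3, outs - 1);
const osc = ctx.createOscillator();
const gain = ctx.createGain();
gain.gain.value = 0.2;
osc.connect(gain).connect(merger, 0, ch);
osc.start();
```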
This article explores the frontiers of interactive, modular music creation by presenting how existing WAM plugins, available on the Web in their 2D version, can be deployed in real-time 3D collaborative environments without any modification…
RhizomeBridge is a VST/AU plugin that enables real-time, bidirectional communication with minimal latency and sample-accurate precision between a Digital Audio Workstation (DAW) and an external Node.js process. Unlike protocol-based approaches…
Lately, the rise of AI generative systems has significantly influenced academic discourse on assisted composition, reshaping research agendas and scholarly practices. While generative tools can streamline exploratory workflows, they also…
This talk presents practice-based research that uses choreography to critique the often-concealed algorithms operating in the background of everyday web environments. By harnessing choreographic practices, researchers can both reinforce…