Lately, the rise of generative AI systems has significantly influenced academic discourse on assisted composition, reshaping research agendas and scholarly practices. While generative tools can streamline exploratory workflows, they also automate key real-time choices such as spectral shaping, rhythmic articulation, and gesture timing, thereby confining the composer's reflective agency to the post-hoc evaluation of material that has already been generated.
In response to this issue, we present DawBi, a prototypical Max for Live plugin that opens a WebSocket-based, bidirectional dialogue between a composer's Digital Audio Workstation and °'°KOBI, a web-based knowledge ecosystem that enhances creativity through semantic analysis and reflective feedback. Rather than generating music, the framework runs a real-time analytic loop: DawBi streams audio descriptors from the DAW to °'°KOBI, which hosts an annotated corpus of compositional works; °'°KOBI matches the incoming data to this corpus and returns the semantic tags of the closest musical pieces as a natural-language reply. The immediate link between evolving material and critically informed semantic descriptors prompts the composer to question, refine, and reposition the work in progress, sustaining reflective agency.
This continuous and asynchronous interaction between DawBi and °’°KOBI promotes a vision of assisted composition not as automatic substitution, but as reflective practice. Here, the system is not designed to produce music, but rather expands the critical, perceptual, and epistemic affordances of the compositional process, opening up new forms of co-creation at the intersection of art, code, and listening.
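The analytic loop described above can be sketched in a few lines of JavaScript. The following is a minimal, hypothetical illustration of the corpus-matching step only: all descriptor names (centroid, flux, density), corpus entries, and tags are invented for the example and are not drawn from DawBi or °'°KOBI themselves.

```javascript
// Hypothetical sketch of the server-side matching step: incoming audio
// descriptors are compared against an annotated corpus, and the semantic
// tags of the nearest piece are returned. Descriptor fields, corpus
// entries, and tags are all illustrative, not the actual °'°KOBI schema.

const corpus = [
  { title: "Piece A",
    descriptors: { centroid: 0.8, flux: 0.2, density: 0.6 },
    tags: ["bright", "sparse gestures"] },
  { title: "Piece B",
    descriptors: { centroid: 0.3, flux: 0.7, density: 0.9 },
    tags: ["dark", "dense texture"] },
];

// Euclidean distance between two descriptor vectors with the same keys
function distance(a, b) {
  return Math.sqrt(
    Object.keys(a).reduce((sum, k) => sum + (a[k] - b[k]) ** 2, 0)
  );
}

// Return the title and semantic tags of the closest corpus entry
function matchDescriptors(incoming) {
  let best = null;
  let bestDist = Infinity;
  for (const piece of corpus) {
    const d = distance(incoming, piece.descriptors);
    if (d < bestDist) {
      bestDist = d;
      best = piece;
    }
  }
  return { title: best.title, tags: best.tags };
}

// Example: one frame of descriptors streamed from the DAW
const reply = matchDescriptors({ centroid: 0.75, flux: 0.25, density: 0.5 });
console.log(reply.tags.join(", ")); // "bright, sparse gestures"
```

In the actual system these frames would arrive over a WebSocket connection from the Max for Live plugin, and the reply would be phrased as natural language rather than returned as a raw tag array.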
WAC is an international conference dedicated to web audio technologies and applications. The conference addresses academic research, artistic research, development, design, evaluation, and standards concerned with emerging audio-related web technologies such as the Web Audio API, WebRTC, WebSockets, and JavaScript.
The conference welcomes researchers, developers, designers, artists, and all people interested in the fields of web development and music technology.
Web Audio Conferences were previously held in 2015 at IRCAM and Mozilla in Paris, in 2016 at Georgia Tech in Atlanta, in 2017 at the Centre for Digital Music, Queen Mary University of London, in 2018 at TU Berlin, in 2019 at the Norwegian University of Science and Technology in Trondheim, in 2021 at Universitat Pompeu Fabra and SonoSuite in Barcelona, Spain (virtual event), in 2022 at Université Côte d'Azur in Cannes, France, and in 2024 at Purdue University, US.
For this 10th anniversary, the proposed theme, "Hacking and Making with Web Audio", invites scholars, researchers, developers, designers, and artists to explore and engage with new uses of the Web Audio API across various disciplines and contexts. In particular, we welcome original contributions that question and propose novel uses of Web and Audio technologies, e.g. in their relation to more traditional music technologies and ecosystems, with or without the Internet, or mixing paradigms such as mobile or IoT technologies.
November 20, 2025
This paper introduces p5.spatial.js, an open-source JavaScript library for creating multichannel sound works in the web browser. Designed to extend the popular creative coding environment p5.js, p5.spatial.js adds multichannel audio output
This article explores the frontiers of interactive, modular music creation by presenting how existing WAM plugins, available on the Web in their 2D version, can be deployed in real-time 3D collaborative environments without any modification
Music audio concepts and design for VR are relatively under-explored compared to visual analogs. This paper addresses the design and evaluation of a dynamic experience of music in Web VR called VRTGO, in which a user explores three altern
RhizomeBridge is a VST/AU plugin that enables real-time, bidirectional communication with minimal latency and sample-accurate precision between a Digital Audio Workstation (DAW) and an external Node.js process. Unlike protocol-based approa
This talk presents practice-based research that uses choreography to critique the often-concealed algorithms operating in the background of everyday web environments. By harnessing choreographic practices, researchers can both reinforce e