March 17, 2021
During the past decade, new object-based immersive audio content formats and creation tools have been developed for cinematic and musical production. These technologies free the music creator from the constraints of standardized loudspeaker configurations. They also support head rotation along three degrees of freedom (3-DoF), unlocking a natural immersive listening experience over headphones or wearable audio devices. Meanwhile, interactive audio experiences in video games and virtual or augmented reality require a scene representation that supports 6-DoF listener navigation, in which an audio object models a natural sound source with controllable distance, orientation, and directivity properties. Additionally, acoustic environment properties must be represented explicitly and decoupled from the sound source description. We examine and compare these two conceptions of object-based spatial audio and seek to unify them, with a view to connecting previously disparate digital media applications and industries.
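To make the 6-DoF scene representation concrete, the following is a minimal sketch of the kind of data model the abstract describes: each audio object carries its own position, orientation, and directivity, while acoustic environment properties live in a separate structure. All class and field names here are illustrative assumptions, not taken from any particular format or standard.

```python
from dataclasses import dataclass, field
from math import dist


@dataclass
class AudioObject:
    """One sound source in a 6-DoF scene (hypothetical schema)."""
    position: tuple[float, float, float]     # (x, y, z) in metres
    orientation: tuple[float, float, float]  # (yaw, pitch, roll) in degrees
    directivity: str = "omni"                # radiation-pattern label, e.g. "cardioid"
    gain_db: float = 0.0


@dataclass
class AcousticEnvironment:
    """Room properties, deliberately decoupled from the sources."""
    reverb_time_s: float = 0.5               # broadband RT60
    room_size_m: tuple[float, float, float] = (5.0, 4.0, 3.0)


@dataclass
class Scene:
    objects: list[AudioObject] = field(default_factory=list)
    environment: AcousticEnvironment = field(default_factory=AcousticEnvironment)


def distance_to_listener(obj: AudioObject,
                         listener_pos: tuple[float, float, float]) -> float:
    """Euclidean source-listener distance, as a renderer would use
    for distance attenuation during 6-DoF navigation."""
    return dist(obj.position, listener_pos)
```

Because the listener pose is not baked into the objects, a renderer can re-evaluate source distances and orientations every frame as the listener moves, which is exactly the property that distinguishes a 6-DoF scene from a 3-DoF (rotation-only) mix.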