WAVE: Interactive Wave-based Sound Propagation for Virtual Environments

Ravish Mehra     Atul Rungta     Abhinav Golas     Ming Lin     Dinesh Manocha
University of North Carolina at Chapel Hill


We present an interactive wave-based sound propagation system that generates accurate, realistic sound in virtual environments for dynamic (moving) sources and listeners. We propose a novel algorithm to accurately solve the wave equation for dynamic sources and listeners using a combination of precomputation techniques and GPU-based runtime evaluation. Our system can handle large environments typically used in VR applications, compute spatial sound corresponding to the listener's motion (including head tracking), and handle both omnidirectional and directional sources, all at interactive rates. Compared to prior wave-based techniques applied to large scenes with moving sources, we observe a significant reduction in runtime memory. The overall sound propagation and rendering system has been integrated with the Half-Life 2 game engine, the Oculus Rift head-mounted display, and the Xbox game controller, enabling users to experience high-quality acoustic effects (e.g., amplification, diffraction low-passing, high-order scattering) and spatial audio based on their interactions in the VR application. We also provide the results of preliminary user evaluations, conducted to study the impact of wave-based acoustic effects and spatial audio on users' navigation performance in virtual environments.
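To give a rough sense of what "solving the wave equation" entails, the sketch below advances the 2D scalar wave equation with a standard finite-difference time-domain (FDTD) scheme. This is only an illustration of the underlying physics, not the paper's algorithm (which relies on precomputation and GPU-based runtime evaluation); the grid size, speed of sound, and impulse source are arbitrary choices for the example.

```python
import numpy as np

def fdtd_step(p_prev, p_curr, c, dt, dx):
    """One leapfrog time step of p_tt = c^2 * (p_xx + p_yy) on a uniform grid.

    Boundary cells are clamped to zero (a crude rigid boundary), which also
    prevents the wrap-around that np.roll would otherwise introduce.
    """
    # 5-point Laplacian stencil via shifted copies of the field.
    lap = (np.roll(p_curr, 1, 0) + np.roll(p_curr, -1, 0) +
           np.roll(p_curr, 1, 1) + np.roll(p_curr, -1, 1) - 4.0 * p_curr)
    p_next = 2.0 * p_curr - p_prev + (c * dt / dx) ** 2 * lap
    p_next[0, :] = p_next[-1, :] = 0.0
    p_next[:, 0] = p_next[:, -1] = 0.0
    return p_next

n = 128                        # grid cells per side (illustrative)
c = 343.0                      # speed of sound in air (m/s)
dx = 0.1                       # cell size (m)
dt = 0.5 * dx / (c * np.sqrt(2))  # CFL-stable time step for 2D

p_prev = np.zeros((n, n))
p_curr = np.zeros((n, n))
p_curr[n // 2, n // 2] = 1.0   # impulse source at the domain center

for _ in range(100):
    p_prev, p_curr = p_curr, fdtd_step(p_prev, p_curr, c, dt, dx)
```

Even this toy solver hints at why interactive wave-based propagation is hard: cost grows with scene volume and with the fourth power of the maximum frequency, which is what motivates the precomputation and GPU evaluation described above.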

Ravish Mehra, Atul Rungta, Abhinav Golas, Ming Lin, and Dinesh Manocha. WAVE: Interactive Wave-based Sound Propagation for Virtual Environments.

Preprint (PDF, 6.5 MB), (IEEE VR 2015, Proceedings of IEEE TVCG)

Video (WMV, 60 MB)