Interactive Virtual Percussion Instruments on Mobile Devices

Zhimin Ren and Ming C. Lin

Department of Computer Science
University of North Carolina at Chapel Hill





ABSTRACT

We present a multimodal virtual percussion instrument system on consumer mobile devices that allows users to design and configure customizable virtual percussion instruments and interact with them in real time. Users can interactively create virtual instruments of different materials and shapes by editing and selecting the desired characteristics. Both visual and auditory feedback are then computed on the fly to correspond automatically to the instrument properties and the user's interaction. We use efficient 3D input-processing algorithms to approximate and represent real-time multi-touch input with key meta-properties, and adopt fast physical modeling to synthesize the corresponding sounds. Despite the relatively limited computing resources of mobile devices, the system delivers rich and responsive multimodal feedback driven by real-time user input. A pilot study was conducted to assess the effectiveness of the system.
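The abstract does not spell out the synthesis method, but a common fast physical-modeling approach for struck objects is modal synthesis: each sound is approximated as a sum of exponentially decaying sinusoids whose frequencies and damping rates are determined by the instrument's material and shape, and whose amplitudes are scaled by the strike extracted from the touch input. The sketch below is purely illustrative and assumes this modal formulation; the `Mode` struct, the `synthesizeStrike` function, and the example modal data are hypothetical and are not taken from the paper.

    // Minimal modal-synthesis sketch (illustrative only; not the authors' implementation).
    // A strike excites a set of modes; the output is the sum of their damped sinusoids.
    #include <cmath>
    #include <cstddef>
    #include <vector>

    struct Mode {
        double freqHz;   // modal frequency (depends on material and shape)
        double damping;  // exponential decay rate, in 1/s
        double gain;     // excitation amplitude for this strike (position/strength dependent)
    };

    // Render numSamples of audio for one strike at sample rate fs (Hz).
    std::vector<float> synthesizeStrike(const std::vector<Mode>& modes,
                                        double fs, std::size_t numSamples) {
        const double kPi = 3.14159265358979323846;
        std::vector<float> out(numSamples, 0.0f);
        for (const Mode& m : modes) {
            const double w = 2.0 * kPi * m.freqHz;
            for (std::size_t n = 0; n < numSamples; ++n) {
                const double t = static_cast<double>(n) / fs;
                out[n] += static_cast<float>(
                    m.gain * std::exp(-m.damping * t) * std::sin(w * t));
            }
        }
        return out;
    }

    int main() {
        // Hypothetical modes for a small struck bar; real modal data would be
        // derived from the user-selected material and shape.
        std::vector<Mode> modes = {
            {440.0, 3.0, 0.8}, {1220.0, 6.0, 0.4}, {2620.0, 12.0, 0.2}};
        std::vector<float> samples = synthesizeStrike(modes, 44100.0, 44100);  // one second
        return samples.empty() ? 1 : 0;
    }

Because each mode is evaluated independently, the per-sample cost grows linearly with the number of modes, which is why this kind of synthesis can remain responsive on the limited computing resources of mobile devices.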


PUBLICATION

Interactive Virtual Percussion Instruments on Mobile Devices
Proceedings of ACM VRST, Nov. 2015.

PREPRINT (PDF)


DEMO VIDEOS

  • High Res (41 MB)
  • Low Res (15 MB)