A master's thesis from Aalborg University


Physics-Based Real-Time Sound Synthesis for Virtual Reality Musical Instruments: State of the Art, Design and Implementation of VRMIs

Author

Term

4th term

Publication year

2017

Pages

68

Abstract


Advances in Virtual Reality (VR) technology are opening new ways to play and design musical instruments. Sophisticated motion tracking and limitless virtual environments enable novel forms of musical interaction, and this has given rise to a new category of expressive musical interfaces: Virtual Reality Musical Instruments (VRMIs). This thesis presents interactive VRMI prototypes that integrate physics-based sound synthesis, that is, sound generated by simulating the physical behavior of vibrating objects. The prototypes were developed through an iterative design process that included usability testing and an evaluation of crossmodal associations: how sensory cues such as sight and sound relate to one another. The usability findings indicate a need to improve the robustness both of the mapping between user gestures and the resulting sound events (gesture mapping) and of the detection of collisions between virtual objects (collision detection). In the crossmodal experiments, participants showed a slight tendency to identify the size and material of the sound-producing object from its sound. However, because the sample size was small and the experimental design may have weaknesses, further work and evaluation are needed to validate these indications.

[This abstract was generated with the help of AI]
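
To make the idea of physics-based sound synthesis more concrete, the sketch below renders a struck object with modal synthesis, a common physics-based technique in which the sound is a sum of exponentially decaying sinusoids, one per resonant mode of the object. This is an illustrative sketch only, not the synthesis model used in the thesis; the function name modal_strike and all numeric parameters (mode ratios, decay rates, amplitudes) are assumptions chosen for demonstration.

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def modal_strike(base_freq, mode_ratios, decay_rates, amplitudes,
                 duration=1.5, sample_rate=SAMPLE_RATE):
    """Render a struck-object sound as a sum of exponentially
    decaying sinusoids, one per resonant mode (modal synthesis)."""
    t = np.arange(int(duration * sample_rate)) / sample_rate
    sound = np.zeros_like(t)
    for ratio, decay, amp in zip(mode_ratios, decay_rates, amplitudes):
        freq = base_freq * ratio
        sound += amp * np.exp(-decay * t) * np.sin(2 * np.pi * freq * t)
    peak = np.max(np.abs(sound))
    return sound / peak if peak > 0 else sound

# Illustrative (made-up) parameters for a small "wooden" bar.
wood_small = modal_strike(
    base_freq=440.0,                 # higher pitch suggests a smaller object
    mode_ratios=[1.0, 2.76, 5.40],   # inharmonic ratios typical of a struck bar
    decay_rates=[8.0, 12.0, 18.0],   # fast decay suggests a damped material like wood
    amplitudes=[1.0, 0.5, 0.25],
)
```

In a model of this kind, lowering base_freq suggests a larger object, while slower decay rates suggest a harder, more resonant material such as metal; these are exactly the kinds of auditory cues the crossmodal experiments described above probed.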