AAU Student Projects
A master's thesis from Aalborg University


An Augmented Reality Android Application For Interaction with Sound Zones

Authors


Term

4th term

Publication year

2021

Submitted on

Pages

125

Abstract

This project investigates how sound zones can be made visible and easy to control using Augmented Reality (AR). Sound zones are understood here as defined areas tied to particular sounds or audio settings. A central challenge is that these zones have a hidden affordance: it is hard for users to see where the zones are and what they can do with them. To address this, an Android app was developed that overlays AR visualizations on the camera image, showing the location and shape of the sound zones while also letting the user interact with them. The project maps possible forms of interaction in AR and identifies touch-screen gestures as a suitable means. The design process documents the key decisions made along the way, establishes requirements, and outlines how the user should be able to act in the app. Using the GV Design Sprint method, a fast and structured design process, several visualization drafts were developed. The result was three visualization concepts and four possible shapes that sound zones can take. The implementation covers the system architecture and the use of ARCore and Sceneform to realize the AR interaction. The usability of the app and the visualizations was tested with users, and their feedback was incorporated into a second development iteration. Overall, the results show that the developed app makes it possible to see and interact with sound zones through AR.

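The abstract notes that ARCore and Sceneform were used to realize the AR interaction, with touch-screen gestures for manipulation. As a rough illustration of how such an overlay can be wired up, the Kotlin sketch below anchors a translucent cylinder (one plausible zone shape) at a tapped ARCore plane and makes it manipulable with Sceneform's standard gesture controls. The class name SoundZoneVisualizer, the zone dimensions, and the color are illustrative assumptions, not details taken from the thesis.

```kotlin
import com.google.ar.core.HitResult
import com.google.ar.core.Plane
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.math.Vector3
import com.google.ar.sceneform.rendering.Color
import com.google.ar.sceneform.rendering.MaterialFactory
import com.google.ar.sceneform.rendering.ShapeFactory
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

// Illustrative helper: renders a sound zone as a translucent cylinder
// anchored where the user taps a detected plane. Names and values are
// assumptions, not taken from the report.
class SoundZoneVisualizer(private val arFragment: ArFragment) {

    fun enableTapToPlace() {
        // Sceneform's ArFragment reports taps that hit an ARCore plane.
        arFragment.setOnTapArPlaneListener { hitResult: HitResult, _: Plane, _ ->
            placeZone(hitResult)
        }
    }

    private fun placeZone(hitResult: HitResult) {
        val context = arFragment.requireContext()
        // A semi-transparent material so the camera feed stays visible
        // behind the zone overlay.
        MaterialFactory.makeTransparentWithColor(context, Color(0.2f, 0.6f, 1.0f, 0.4f))
            .thenAccept { material ->
                // A cylinder is one plausible zone shape; the radius and
                // height here are placeholder values.
                val zoneRenderable = ShapeFactory.makeCylinder(
                    /* radius = */ 0.5f,
                    /* height = */ 1.0f,
                    /* center = */ Vector3(0f, 0.5f, 0f),
                    material
                )
                // Anchor the zone to the tapped real-world position.
                val anchorNode = AnchorNode(hitResult.createAnchor()).apply {
                    setParent(arFragment.arSceneView.scene)
                }
                // TransformableNode provides the standard drag/pinch/twist
                // touch gestures for moving and resizing the zone.
                TransformableNode(arFragment.transformationSystem).apply {
                    renderable = zoneRenderable
                    setParent(anchorNode)
                }
            }
    }
}
```

Using touch gestures through TransformableNode matches the interaction style the abstract identifies as suitable: the zone overlay can be repositioned and rescaled directly on the touch screen while the camera view shows its real-world placement.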