• Márton Havasi
  • Andriy Bogdanov
  • Oliver Emil Trudslev Hansen
This project aims to provide a means of interacting with sound zones through Augmented Reality (AR). To realise this, an Android mobile application has been developed. The underlying problem of sound zone interaction has been identified as the hidden affordance of the sound zones. To address this problem, AR is used to visualise the sound zones and to offer options for user interaction. Research into AR interaction methods was conducted, and touch-screen gestures were chosen as the means of achieving the interaction goal. This report provides a thorough description of the design process and elaborates on the design choices made throughout development. The user interaction options have been outlined and a set of requirements established. Several candidate sound zone visualisations were created using the GV Design Sprint method, and the features and properties of the proposed solution are described. The design sprint resulted in three visualisation concepts and four shapes the sound zones could take on. The implementation part of this report describes the system architecture in detail, along with the use of ARCore and Sceneform to implement AR interaction. The usability of the application and the designed visualisations was tested, and participant feedback was incorporated in the second implementation iteration. Based on the results, it can be concluded that the developed application allows for interaction with sound zones through AR.
Publication date: 3 Jun 2021
Number of pages: 125
ID: 413545872