A master's thesis from Aalborg University


Deictic pointing and mobile device interaction techniques in 360-degree videos

Author

Term

4th term

Education

Publication year

2016

Submitted on

Pages

49

Abstract


Today’s smartphones offer several ways to explore 360-degree videos: swiping to rotate the view, moving the phone using its motion sensors (IMU), or placing it in a mobile VR headset. In many real-world settings, such as guided museum tours, people use deictic pointing—pointing to direct someone’s attention—to refer to objects or areas, sometimes outside what the viewer is currently looking at. This study examined whether people can notice such pointing gestures and identify their targets when the gesture occurs outside the current field of view in a 360-degree video. We compared three mobile interaction techniques (click-and-drag, IMU-based rotation, and mobile virtual reality) with an equivalent scenario in the real world. The tests showed that participants performed significantly better in the real world than with any of the three mobile techniques when detecting pointing and finding its target. The three mobile techniques performed similarly to one another, suggesting that, for 360-degree videos containing deictic pointing, the choice among these interaction methods may not make a meaningful difference.

[This abstract was generated with the help of AI]