User Experience Using Physiological Measurements
Authors
Hubert, Benjamin; Fuglsang, Michael Lausdahl; Haxholm, Henrik
Term
4th term
Publication year
2016
Submitted on
2016-06-14
Pages
43
Abstract
User experience (UX) and emotions are increasingly studied in human–computer interaction (HCI). A growing approach is to use physiological signals to help evaluate UX. This project reports two studies that ask: can physiological measurements predict people’s self-reported feelings on the Self-Assessment Manikin (SAM), and how are signals recorded during system use related to signals recorded later in a cued-recall session? We collected brain activity (EEG), skin conductance (EDA), heart rate (HR), and facial movement data using Emotiv Epoc, Mindplace Thoughtstream, an Arduino Pulse Sensor, and Microsoft Kinect. In Study 1, these signals and SAM ratings were used to train a support vector machine (SVM), a machine-learning model, to predict SAM values. In Study 2, data were collected from groups both during interaction with the system and again during a recall session, with different time delays and exposure to stimuli in between. The two sets of data were compared using Pearson product–moment correlation and ANOVA, standard statistical tests. Study 1 showed that physiological data predict SAM ratings significantly better than guessing, and that combining multiple sensors (sensor fusion) further improves accuracy. Study 2 found significant relationships between interaction and cued-recall signals for EEG and EDA, and that these correlations for EEG decrease as the time delay grows. These findings suggest strong potential for computer-assisted UX evaluation using physiological sensing, and highlight that the time interval between the original experience and cued recall can affect results.
[This abstract was generated with the help of AI]
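As a purely illustrative sketch of the two analyses the abstract describes, the snippet below runs an SVM that predicts SAM ratings from physiological features (Study 1) and a Pearson correlation between interaction-time and cued-recall signals (Study 2). All data here are synthetic placeholders, not the project's dataset, and the feature layout is an assumption.

```python
# Hypothetical sketch of the two analyses described above, on synthetic
# data (not the project's dataset): an SVM predicting SAM ratings from
# physiological features, and a Pearson correlation between signals from
# interaction and cued recall.
import numpy as np
from scipy.stats import pearsonr
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Study 1 sketch: rows are trials; the four columns stand in for EEG,
# EDA, heart-rate, and facial-movement features.
y = np.repeat(np.arange(1, 10), 14)[:120]  # SAM ratings on the 1-9 scale
X = rng.normal(size=(120, 4))
X[:, 0] += 0.5 * y                         # inject a weak synthetic signal

clf = SVC(kernel="rbf")                    # RBF-kernel SVM, a common default
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validation
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")

# Study 2 sketch: correlate a signal recorded during interaction with a
# noisier version standing in for the cued-recall recording.
interaction = rng.normal(size=60)
recall = interaction + rng.normal(scale=0.5, size=60)
r, p = pearsonr(interaction, recall)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")
```

Accuracy here is only meaningful relative to the chance level of the synthetic labels; the project itself compares predictions against guessing in the same spirit.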