• Frederik Falk
  • Oliver Gyldenberg Hjermitslev
Currently, tetraplegics have limited opportunities to perform activities of daily living (ADL) independently of caregivers. This project explores the possibility of introducing a shared control system based on computer vision and multimodal intent prediction. Following a review of previous work, we design a solution to improve simple interactions necessary for ADL. The system utilizes galvanic skin response and a novel intent prediction method based on previous user input. An evaluation with 24 able-bodied participants was conducted to gather both subjective and objective data about the interactions and the system's performance. The evaluation shows that aggressive arbitration can hinder certain measures, but a tradeoff exists that requires further work and a longer, more comprehensive evaluation to define.
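To illustrate the arbitration tradeoff mentioned above, the following is a minimal sketch of a generic linear shared-control blend; the `arbitrate` function, the aggressiveness parameter, and the signal ranges are illustrative assumptions and are not taken from the report's actual method.

```python
import numpy as np

def arbitrate(user_cmd, robot_cmd, confidence, aggressiveness=0.8):
    """Linearly blend the user's command with the autonomous suggestion.

    Generic shared-control arbitration sketch (not the report's method):
    the more confident the intent predictor and the higher the
    aggressiveness, the more authority shifts to the autonomous command.
    """
    alpha = np.clip(aggressiveness * confidence, 0.0, 1.0)  # assistance weight
    return (1.0 - alpha) * np.asarray(user_cmd) + alpha * np.asarray(robot_cmd)

# Example: with high aggressiveness the arbiter largely overrides the user's
# input, which can feel like a loss of control even when predictions are good.
blended = arbitrate(user_cmd=[0.2, 0.0], robot_cmd=[1.0, 0.5], confidence=0.9)
print(blended)
```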
Language: English
Publication date: 2019
Number of pages: 121
ID: 305033499