Strategic Companions - Behavioral Evidence of Trust and Reciprocity in Human-Robot Economic Games
Author
Term
4th term
Education
Publication year
2025
Submitted on
2025-10-22
Abstract
Robots are increasingly integrated across a wide range of application domains and are evolving toward more socially capable forms of interaction. As a result, the quality and reliability of human–robot interaction (HRI) have become critical factors in the success of human–robot collaboration. This thesis investigates HRI through a game-theoretic framework, focusing on trust and reciprocity as two fundamental components that shape cooperative behavior between humans and robots. The aim of the thesis is to analyze the extent to which people trust and act reciprocally toward robots. The study employs the trust game, a well-established paradigm from behavioral economics, as the basis for two experiments. In the first experiment, participants interacted with both the LuxAI humanoid robot and a human partner to examine potential differences in trust across these conditions. In the second experiment, participants received a favor from the robot prior to the game to determine whether this gesture would elicit reciprocal behavior toward the robot. In the first experiment, participants sent similar numbers of points to human and robot partners; however, receivers' behavior differed between the two conditions, with participants returning significantly less to the robot than to the human partners. In the second experiment, the favor was associated with lower subsequent transfers, although this effect was not statistically significant. These findings align with prior research demonstrating that robots can be perceived as cooperative and social partners. The results also highlight the dynamic nature of trust and reciprocity and underscore the need for systematic investigation of these processes in human–robot interaction.
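For context, the trust game mentioned in the abstract follows a simple transfer-and-return structure: the sender transfers some of an initial endowment, the transfer is multiplied before reaching the receiver, and the receiver decides how much to return. The sketch below illustrates one round under an assumed endowment of 10 points and a multiplier of 3; these parameter values and the function name are illustrative and are not taken from the thesis.

```python
# Minimal sketch of a single trust-game round.
# ENDOWMENT and MULTIPLIER are assumed values for illustration only;
# the abstract does not state the parameters used in the experiments.

ENDOWMENT = 10   # points initially given to the sender (assumed)
MULTIPLIER = 3   # factor applied to the transferred amount (assumed)

def trust_game_round(sent: int, returned: int) -> tuple[int, int]:
    """Compute final payoffs (sender, receiver) for one round.

    sent     -- points the sender transfers to the receiver (0..ENDOWMENT)
    returned -- points the receiver sends back (0..sent * MULTIPLIER)
    """
    assert 0 <= sent <= ENDOWMENT
    received = sent * MULTIPLIER          # transfer is tripled before reaching the receiver
    assert 0 <= returned <= received
    sender_payoff = ENDOWMENT - sent + returned
    receiver_payoff = received - returned
    return sender_payoff, receiver_payoff

# Example: sender transfers 6 points, receiver returns 9 of the 18 received.
print(trust_game_round(6, 9))  # -> (13, 9)
```

In this standard design, the amount sent is commonly interpreted as a behavioral measure of trust and the amount returned as a measure of reciprocity, which matches how the abstract reports the two experiments.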
Keywords
Documents
