A master's thesis from Aalborg University


Understanding UI in Crowdsourcing by Mitigating Unconscious Bias in Job Advertisements

Authors


Term

4th term

Publication year

2021

Pages

12

Abstract


Workplace inequality can be reinforced by subtle, gendered wording in job advertisements that shapes who feels encouraged to apply. This thesis examines how crowdsourcing-based interface design can help detect and mitigate such unconscious bias. Building on a pre-study, we designed a prototype, CrowdCorrector, and ran a Wizard-of-Oz study with nine participants who reviewed three job ads (Nurse, Business Consultant, IT Consultant) under three interface conditions: Replacement, Suggestion, and Demographic. We conducted semi-structured interviews, analyzed interactions using thematic coding, and interpreted results in relation to Park et al.'s Six Burdens of AI. Findings indicate that the Replacement condition depends on recall rather than recognition and can be demanding; the Suggestion condition was preferred because it let participants influence the final wording; and in the Demographic condition participants found some UI elements useful but struggled to locate annotated terms, revealing a mismatch between presumed language fit and users' own perceptions. The study also highlights where crowdsourcing can outperform AI for identifying biased language and offers HCI design implications to reduce linguistic bias in job ads and improve applicant inclusivity.

[This summary has been generated with the help of AI directly from the project (PDF)]