Role: Lead UX Researcher & Author
Partner: Akshita (Engineering & Technical Implementation)
Tools: Unity, Oculus Quest 2, Google Sheets
Timeline: Spring 2025 (first round of testing complete) | Summer 2025 (second round of testing, analysis, and draft refinement)
This study builds on the findings of Phase One by investigating how gamified auditory and visual feedback in VR affects spatial motor control across neurotypes. The goal was to test whether snap-to-place visual cues and quack sound effects improve task performance and engagement, particularly in neurodivergent users.
Participants used the Oculus Quest 2 headset to place virtual duck objects into target outlines under two conditions, presented in randomized order.
Each participant completed both conditions (within-subjects design).
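As a minimal sketch of how per-participant condition-order randomization can be handled in a within-subjects design like this one (this is hypothetical illustration, not the study's actual code; the condition labels "A" and "B" are placeholders for the two feedback conditions):

```python
import random

def randomize_order(participant_id, conditions=("A", "B"), seed=0):
    """Return a reproducible, randomized condition order for one participant.

    Seeding with the participant ID makes the order deterministic per
    participant, so the assignment can be re-derived during analysis.
    """
    rng = random.Random(seed + participant_id)
    order = list(conditions)
    rng.shuffle(order)
    return order
```

Because every participant completes both conditions, only the presentation order is randomized; seeding per participant keeps the assignment auditable when results are later tallied (for example, in Google Sheets).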