On December 1st, 2025, Research Associate Gili Ron will present her work on sensor-supported, bidirectional communication for human–robot collaboration (HRC) at OzCHI 2025.
Abstract:
This paper presents a human-centered cyber-physical system integrating gesture recognition, depth sensing, and motion sensing to support human–robot collaboration (HRC) in timber assembly. The system enables users to control cobots on demand, adapts cobot trajectories to workspace changes, and ensures safety through motion-triggered
stops, while a visual interface displays real-time sensor data and planned actions. We evaluated the system in a small-scale timber assembly study with 21 participants—Novice Academics, Experienced Academics, and Novice Professionals—performing assembly tasks under three modes: Human Agency (gesture control), Robot Agency (autonomous sensing), and Combined Agency (integrated). Results show that gesture control achieved the highest perceived usability and the lowest mental and physical demand, whereas autonomous sensing produced the fastest completion times (median 6.02 min). The combined mode, intended to merge both paradigms, paradoxically led to slower performance (median 10.15 min) and higher cognitive load, as users had to coordinate overlapping control
schemes. Although some task-time data for professionals were unavailable, qualitative feedback revealed consistent trends: academics valued autonomy and transparency, while professionals prioritized safety and reliability.
These findings highlight that user background profoundly shapes collaboration preferences, and that flexible interfaces allowing adjustable safety thresholds, feedback modes, and levels of autonomy are essential for effective and inclusive HRC design in construction contexts.