Browsing by Author "Stuerzlinger,W."
Now showing 1 - 2 of 2
Conference Object (Citation Count: 0)
Eye-Hand Coordination Training: A Systematic Comparison of 2D, VR, and AR Display Technologies and Task Instructions (Institute of Electrical and Electronics Engineers Inc., 2024)
Aliza, A.; Zaugg, I.; Celik, E.; Stuerzlinger, W.; Ortega, F.R.; Batmaz, A.U.; Sarac, M.
Previous studies on Eye-Hand Coordination Training (EHCT) compared user motor performance across different hardware using cross-sectional studies. In this paper, we compare user motor performance with an EHCT setup in Augmented Reality (AR), Virtual Reality (VR), and on a 2D touchscreen display in a longitudinal study. Through a ten-day user study, we thoroughly analyzed the motor performance of twenty participants under five task instructions focusing on speed, error rate, accuracy, precision, or no specific criterion. As a novel evaluation criterion, we also analyzed the participants' performance in terms of effective throughput. The results showed that each task instruction affects one or more psychomotor characteristics of the trainee, which highlights the importance of personalized training programs. Regarding display technologies, most participants improved more in VR than in 2D or AR. We also identified effective throughput as a good candidate for monitoring overall motor-performance progress in EHCT systems. © 2024 IEEE.

Conference Object (Citation Count: 0)
EyeGuide & EyeConGuide: Gaze-based Visual Guides to Improve 3D Sketching Systems (Association for Computing Machinery, 2024)
Turkmen, R.; Gelmez, Z.E.; Batmaz, A.U.; Stuerzlinger, W.; Asente, P.; Sarac, M.; Machuca, M.D.B.
Visual guides help align strokes and improve accuracy in Virtual Reality (VR) sketching tools. Automatic guides that appear at relevant sketching areas enable seamless guided sketching. We explore guides that use eye-tracking to adapt to the user's visual attention. EyeGuide and EyeConGuide make visual grid fragments appear spatially close to the user's intended sketches, based on the user's eye-gaze direction and the 3D position of the hand. We evaluated the techniques in two user studies across simple and complex sketching objectives in VR. The results show that gaze-based guides have a positive effect on sketching accuracy, perceived usability, and preference over manual activation in the tested tasks. Our research contributes to integrating gaze-contingent techniques for assistive guides and presents important insights into multimodal design applications in VR. © 2024 Copyright held by the owner/author(s).
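
The first record above reports results in terms of effective throughput. The abstract does not spell out the computation, but effective throughput is conventionally derived from Fitts' law in the ISO 9241-9 style. The sketch below is a minimal, assumption-laden illustration of that standard formulation, not the paper's actual pipeline; all variable names and the example numbers are hypothetical.

```python
import math
from statistics import mean, stdev

def effective_throughput(distances, errors, times):
    """Standard ISO 9241-9-style effective throughput in bits per second.

    distances: per-trial movement distances along the task axis
    errors:    per-trial signed endpoint deviations from the target centre
    times:     per-trial movement times in seconds
    NOTE: illustrative sketch only; the cited paper's exact method may differ.
    """
    w_e = 4.133 * stdev(errors)       # effective target width from endpoint spread
    d_e = mean(distances)             # effective movement distance
    id_e = math.log2(d_e / w_e + 1)   # effective index of difficulty (bits)
    return id_e / mean(times)         # throughput (bits/s)

# Hypothetical values for three pointing trials (metres and seconds)
print(effective_throughput([0.30, 0.31, 0.29], [0.004, -0.006, 0.002], [0.62, 0.58, 0.65]))
```

Because it folds speed and endpoint variability into a single number, a measure of this kind is a plausible candidate for tracking overall motor-performance progress, which is consistent with the claim in the abstract.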