Browsing by Author "Kersten-Oertel, Marta"
Now showing 1 - 2 of 2
Conference Object (Citation Count: 0)
Effects of Opaque, Transparent and Invisible Hand Visualization Styles on Motor Dexterity in a Virtual Reality Based Purdue Pegboard Test (IEEE Computer Soc, 2023)
Batmaz, Anıl Ufuk; Hatira, Amal; Sarac, Mine; Kersten-Oertel, Marta
The virtual hand interaction technique is one of the most common interaction techniques used in virtual reality (VR) systems. A VR application can be designed with different hand visualization styles, which might impact motor dexterity. In this paper, we investigate the effects of three hand visualization styles (transparent, opaque, and invisible) on participants' performance in a VR-based Purdue Pegboard Test (PPT). A total of 24 participants were recruited and instructed to place pegs on the board as quickly and accurately as possible. The results indicated that the invisible hand visualization significantly increased the number of completed task repetitions compared to the opaque hand visualization. However, no significant difference was observed in participants' preference for the hand visualization styles. These findings suggest that an invisible hand visualization may enhance performance in the VR-based PPT, potentially indicating the advantages of a less obstructive hand visualization style. We hope our results can guide developers, researchers, and practitioners when designing novel virtual hand interaction techniques.

Conference Object (Citation Count: 0)
Subtask-Based Virtual Hand Visualization Method for Enhanced User Accuracy in Virtual Reality Environments (IEEE Computer Soc, 2024)
Batmaz, Anıl Ufuk; Hatira, Amal; Bashar, Mohammad Raihanul; Gemici, Mucahit; Sarac, Mine; Kersten-Oertel, Marta
In virtual hand interaction techniques, the opacity of the virtual hand avatar can obstruct users' visual feedback, with detrimental effects on accuracy and cognitive load. Because cognitive load is related to gaze movements, our study analyzes participants' gaze movements across opaque, transparent, and invisible hand visualizations in order to create a new interaction technique. For our experimental setup, we used a Purdue Pegboard Test with reaching, grasping, transporting, and inserting subtasks. We examined how long and where participants concentrated during these subtasks and, using these findings, introduced a new virtual hand visualization method to increase accuracy. We hope that our results can be used in future virtual reality applications where users have to interact with virtual objects accurately.
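
Neither abstract specifies an implementation, but as an illustrative aside, the three visualization styles compared in these studies (opaque, transparent, invisible) can be viewed as simple render settings applied to the virtual hand avatar. The minimal Python sketch below maps each style to a hypothetical visibility/opacity configuration; the names (HandStyle, RenderSettings) and the numeric values are assumptions for illustration and do not come from either paper.

    from dataclasses import dataclass
    from enum import Enum, auto

    class HandStyle(Enum):
        """Hand visualization styles compared in the VR-based Purdue Pegboard Test studies."""
        OPAQUE = auto()
        TRANSPARENT = auto()
        INVISIBLE = auto()

    @dataclass
    class RenderSettings:
        """Hypothetical render parameters for the virtual hand avatar (not taken from the papers)."""
        visible: bool   # whether the hand mesh is drawn at all
        opacity: float  # 1.0 = fully opaque, 0.0 = fully transparent

    def settings_for(style: HandStyle) -> RenderSettings:
        """Map a visualization style to illustrative render settings."""
        if style is HandStyle.OPAQUE:
            return RenderSettings(visible=True, opacity=1.0)
        if style is HandStyle.TRANSPARENT:
            # Partial opacity keeps the hand visible without fully occluding the pegs.
            return RenderSettings(visible=True, opacity=0.4)
        return RenderSettings(visible=False, opacity=0.0)

    if __name__ == "__main__":
        for style in HandStyle:
            print(style.name, settings_for(style))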