Browsing by Author "Gelmez, Zeynep Ecem"
Now showing 1–2 of 2
Conference Object (Citation Count: 0)
Effect of Hand and Object Visibility in Navigational Tasks Based on Rotational and Translational Movements in Virtual Reality (IEEE Computer Soc, 2024)
Batmaz, Anıl Ufuk; Gelmez, Zeynep Ecem; Sarac, Mine

During object manipulation in Virtual Reality (VR) systems, realistically visualizing avatars and objects can hinder user performance and experience by complicating the task or distracting the user from the environment due to possible occlusions. Users might feel the urge to make biomechanical adjustments, such as repositioning the head, to see the interaction area. In this paper, we investigate the effect of hand avatar and object visibility on navigational tasks using a VR headset. We performed two user studies in which participants grasped a small, cylindrical object and navigated it through virtual obstacles by performing rotational or translational movements. We used three visibility conditions for the hand avatar (opaque, transparent, and invisible) and two for the object (opaque and transparent). Our results indicate that participants performed faster and with fewer collisions with the invisible and transparent hands than with the opaque hand, and with fewer collisions with the opaque object than with the transparent one. Furthermore, participants preferred the combination of the transparent hand avatar with the opaque object. The findings of this study may help researchers and developers decide on the visibility/transparency conditions of hand avatars and virtual objects for tasks that require precise navigational activities.

Conference Object (Citation Count: 0)
EyeGuide & EyeConGuide: Gaze-based Visual Guides to Improve 3D Sketching Systems (Assoc Computing Machinery, 2024)
Batmaz, Anıl Ufuk; Gelmez, Zeynep Ecem; Stuerzlinger, Wolfgang; Asente, Paul; Sarac, Mine

Visual guides help align strokes and raise accuracy in Virtual Reality (VR) sketching tools. Automatic guides that appear at relevant sketching areas make guided sketching seamless. We explore guides that exploit eye-tracking to adapt to the user's visual attention. EyeGuide and EyeConGuide cause visual grid fragments to appear spatially close to the user's intended sketches, based on the user's eye-gaze direction and the 3D position of the hand. We evaluated the techniques in two user studies across simple and complex sketching objectives in VR. The results show that gaze-based guides have a positive effect on sketching accuracy, perceived usability, and preference over manual activation in the tested tasks. Our research contributes to integrating gaze-contingent techniques for assistive guides and presents important insights into multimodal design applications in VR.