Browsing by Author "Turkmen, R."
Now showing 1 - 2 of 2
Conference Object — Citation Count: 0
Evaluating Voxel-Based Graphical Passwords for Virtual Reality (Institute of Electrical and Electronics Engineers Inc., 2024) Rawat, P.; Turkmen, R.; Nwagu, C.; Sunday, K.; Barrera MacHuca, M.D.
Previous work has proposed voxel-based graphical passwords (VGPs) for Virtual Reality (VR) as a secure, easy-to-remember way to authenticate users. Eye-tracking technology adds a further level of security, as it mitigates observational threats during password entry. However, previous work has yet to evaluate the user performance, usability, and memorability of different combinations of VGPs. In two user studies, we first identified the best combination of shape and volume for VGPs; then, we compared 3D versus 2D VGPs. Our results show that a cube is the best shape in terms of usability and user preference. We also found that 2D VGPs are easier to remember than 3D VGPs, as shown by higher password accuracy and a lower error rate. Our results inform the implementation of VGPs and other graphical passwords in VR. © 2024 IEEE.

Conference Object — Citation Count: 0
EyeGuide & EyeConGuide: Gaze-based Visual Guides to Improve 3D Sketching Systems (Association for Computing Machinery, 2024) Turkmen, R.; Gelmez, Z.E.; Batmaz, A.U.; Stuerzlinger, W.; Asente, P.; Sarac, M.; Machuca, M.D.B.
Visual guides help align strokes and improve accuracy in Virtual Reality (VR) sketching tools. Automatic guides that appear at relevant sketching areas enable seamless guided sketching. We explore guides that exploit eye-tracking to adapt to the user's visual attention. EyeGuide and EyeConGuide cause visual grid fragments to appear spatially close to the user's intended sketches, based on the user's eye-gaze direction and the 3D position of the hand. We evaluated both techniques in two user studies across simple and complex sketching objectives in VR. The results show that gaze-based guides have a positive effect on sketching accuracy, perceived usability, and preference over manual activation in the tested tasks. Our research contributes to integrating gaze-contingent techniques for assistive guides and presents important insights for multimodal design applications in VR. © 2024 Copyright held by the owner/author(s).