Browsing by Author "Hatira,A."
Now showing 1 - 3 of 3
Conference Object | Citation Count: 0
Effect of Hand and Object Visibility in Navigational Tasks Based on Rotational and Translational Movements in Virtual Reality (Institute of Electrical and Electronics Engineers Inc., 2024)
Hatira, A.; Gelmez, Z. E.; Batmaz, A. U.; Sarac, M.
During object manipulation in Virtual Reality (VR) systems, realistically visualizing avatars and objects can hinder user performance and experience by complicating the task or distracting the user from the environment through possible occlusions. Users might feel the urge to make biomechanical adjustments, such as re-positioning the head, to visualize the interaction area. In this paper, we investigate the effect of hand avatar and object visibility in navigational tasks using a VR headset. We performed two user studies in which participants grasped a small, cylindrical object and navigated it through virtual obstacles by performing rotational or translational movements. We used three visibility conditions for the hand avatar (opaque, transparent, and invisible) and two for the object (opaque and transparent). Our results indicate that participants performed faster and with fewer collisions using the invisible and transparent hands than with the opaque hand, and with fewer collisions using the opaque object than with the transparent one. Furthermore, participants preferred the combination of the transparent hand avatar with the opaque object. These findings may help researchers and developers decide on the visibility/transparency conditions of hand avatars and virtual objects for tasks that require precise navigational activity.
© 2024 IEEE.

Conference Object | Citation Count: 0
Enhancing Robotic Performance: Analyzing Force and Torque Measurements for Predicting Execution Failures (Institute of Electrical and Electronics Engineers Inc., 2023)
Alsan, H. F.; Arsan, T.
Robots play an important role in many sectors, automating processes and augmenting human capabilities. However, guaranteeing reliability is critical for effective integration and widespread adoption, so forecasting and managing execution errors is essential. This research examines force and torque measurements to better understand the causes and patterns of robot execution errors. Using data analysis and machine learning approaches, we hope to build prediction models that improve robot design and performance, ultimately boosting reliability and efficacy. This study aims to use a dataset of force and torque measurements to predict and characterize robot execution failures. By addressing these research questions, we hope to uncover the complex links between force and torque measurements and failure types, identify crucial signals or precursors to failures, and construct robust prediction models for accurate failure classification. This study contributes to data science by demonstrating the use of analytics approaches to improve the dependability and performance of robots in real-world scenarios.
© 2023 IEEE.

Conference Object | Citation Count: 0
Subtask-Based Virtual Hand Visualization Method for Enhanced User Accuracy in Virtual Reality Environments (Institute of Electrical and Electronics Engineers Inc., 2024)
Voisard, L.; Hatira, A.; Bashar, M. R.; Gemici, M.; Sarac, M.; Kersten-Oertel, M.; Batmaz, A. U.
In virtual hand interaction techniques, the opacity of the virtual hand avatar can obstruct users' visual feedback, with detrimental effects on accuracy and cognitive load.
Given that cognitive load is related to gaze movements, our study analyzes participants' gaze movements across opaque, transparent, and invisible hand visualizations to create a new interaction technique. For our experimental setup, we used a Purdue Pegboard Test with reaching, grasping, transporting, and inserting subtasks. We examined how long and where participants concentrated during these subtasks and, using the findings, introduced a new virtual hand visualization method to increase accuracy. We hope that our results can inform future virtual reality applications in which users must interact with virtual objects accurately.
© 2024 IEEE.
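The failure-prediction idea summarized in the second entry above (classifying robot execution outcomes from force and torque measurements) can be illustrated with a toy sketch. Everything here is an assumption for illustration only: the synthetic data, the two hypothetical outcome labels, and the simple nearest-centroid classifier stand in for the authors' actual dataset and machine-learning models, which the abstract does not specify.

```python
# Toy sketch (illustrative assumptions, not the paper's method): classify a
# robot execution as "normal" or "collision" from six synthetic force/torque
# features (Fx, Fy, Fz, Tx, Ty, Tz) with a nearest-centroid classifier.
import random

random.seed(0)

def sample(label):
    # Assumption: collisions show larger force/torque magnitudes on average.
    base = 5.0 if label == "collision" else 1.0
    return [random.gauss(base, 0.5) for _ in range(6)], label

data = [sample("normal") for _ in range(50)] + [sample("collision") for _ in range(50)]
random.shuffle(data)
train, test = data[:80], data[80:]

# Nearest-centroid: the mean feature vector of each class in the training set.
centroids = {}
for lab in ("normal", "collision"):
    rows = [x for x, l in train if l == lab]
    centroids[lab] = [sum(col) / len(rows) for col in zip(*rows)]

def predict(x):
    # Assign the class whose centroid is closest in squared Euclidean distance.
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(centroids, key=lambda lab: dist(centroids[lab]))

accuracy = sum(predict(x) == l for x, l in test) / len(test)
print(f"test accuracy: {accuracy:.2f}")
```

On this cleanly separated synthetic data the classifier is near-perfect; real force/torque traces are time series with noise and overlap, which is why the study pursues richer feature analysis and stronger models.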