Batmaz, Anıl Ufuk

Name Variants
Batmaz, Anıl Ufuk
A.,Batmaz
A. U. Batmaz
Anıl Ufuk, Batmaz
Batmaz, Anil Ufuk
Anil Ufuk, Batmaz
Batmaz, A.U.
Batmaz, Anil U.
Batmaz, Anil Ufuk K.
Job Title
Dr. Öğr. Üyesi (Assistant Professor)
Main Affiliation
Mechatronics Engineering
Status
Former Staff

Sustainable Development Goals

  • SDG 4 (Quality Education): 1 research product
  • SDG 3 (Good Health and Well-Being): 4 research products
  • All other SDGs: 0 research products
This researcher does not have a Scopus ID.
This researcher does not have a WoS ID.
Scholarly Output: 29
Articles: 5
Views / Downloads: 6/0
Supervised MSc Theses: 0
Supervised PhD Theses: 0
WoS Citation Count: 171
Scopus Citation Count: 227
WoS h-index: 8
Scopus h-index: 9
Patents: 0
Projects: 0
WoS Citations per Publication: 5.90
Scopus Citations per Publication: 7.83
Open Access Source: 4
Supervised Theses: 0
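The two citations-per-publication figures are consistent with dividing each citation count by the scholarly output of 29. A quick sanity check (Python used here purely for illustration):

```python
# Sanity check: citations per publication = citation count / scholarly output.
# Figures taken from the profile metrics above.
scholarly_output = 29   # total publications
wos_citations = 171
scopus_citations = 227

wos_cpp = wos_citations / scholarly_output
scopus_cpp = scopus_citations / scholarly_output

print(f"WoS: {wos_cpp:.2f}")        # WoS: 5.90
print(f"Scopus: {scopus_cpp:.2f}")  # Scopus: 7.83
```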

Top Venues by Publication Count (page 1 of 5 shown)
  • IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Mar 16-21, 2024, Orlando, FL: 3
  • 22nd IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Oct 16-20, 2023, Sydney, Australia: 2
  • Conference on Human Factors in Computing Systems - Proceedings: 2
  • 28th ACM Symposium on Virtual Reality Software and Technology (VRST 2022): 2
  • International Case Studies in Food Tourism: 2


Scholarly Output Search Results

Now showing 1 - 10 of 29
  • Conference Object
    Citation - WoS: 13
    Citation - Scopus: 14
    Performance Analysis of Saccades for Primary and Confirmatory Target Selection
    (Assoc Computing Machinery, 2022) Mutasim, Aunnoy K.; Batmaz, Anil Ufuk; Mughrabi, Moaaz Hudhud; Stuerzlinger, Wolfgang
    In eye-gaze-based selection, dwell suffers from several issues, e.g., the Midas Touch problem. Here we investigate saccade-based selection techniques as an alternative to dwell. First, we designed a novel user interface (UI) for Actigaze and used it with (goal-crossing) saccades for confirming the selection of small targets (i.e., < 1.5-2 degrees). We compared it with three other variants of Actigaze (with button press, dwell, and target reverse crossing) and two variants of target magnification (with button press and dwell). Magnification-dwell exhibited the most promising performance. For Actigaze, goal-crossing was the fastest option but suffered the most errors. We then evaluated goal-crossing as a primary selection technique for normal-sized targets (>= 2 degrees) and implemented a novel UI for such interaction. Results revealed that dwell achieved the best performance. Yet, we identified goal-crossing as a good compromise between dwell and button press. Our findings thus identify novel options for gaze-only interaction.
  • Article
    Citation - WoS: 4
    Citation - Scopus: 5
    Evaluation of an Immersive Covid-19 Data Visualization
    (IEEE Computer Soc, 2023) Kaya, Furkan; Celik, Elif; Batmaz, Anil Ufuk K.; Mutasim, Aunnoy K.; Stuerzlinger, Wolfgang
    COVID-19 restrictions have detrimental effects on the population, both socially and economically. However, these restrictions are necessary as they help reduce the spread of the virus. For the public to comply, easily comprehensible communication between decision makers and the public is thus crucial. To address this, we propose a novel 3-D visualization of COVID-19 data, which could increase the awareness of COVID-19 trends in the general population. We conducted a user study and compared a conventional 2-D visualization with the proposed method in an immersive environment. Results showed that our 3-D visualization approach facilitated understanding of the complexity of COVID-19. A majority of participants preferred to see the COVID-19 data with the 3-D method. Moreover, individual results revealed that our method increases the engagement of users with the data. We hope that our method will help governments to improve their communication with the public in the future.
  • Conference Object
    Citation - Scopus: 2
    When Anchoring Fails: Interactive Alignment of Large Virtual Objects in Occasionally Failing AR Systems
    (Springer Science and Business Media Deutschland GmbH, 2022) Batmaz, A.U.; Stuerzlinger, W.
    Augmented reality systems show virtual object models overlaid over real ones, which is helpful in many contexts, e.g., during maintenance. Assuming all geometry is known, misalignments in 3D poses will still occur without perfectly robust viewer and object 3D tracking. Such misalignments can impact the user experience and reduce the potential benefits associated with AR systems. In this paper, we implemented several interaction algorithms to make manual virtual object alignment easier, based on previously presented methods, such as HoverCam, SHOCam, and a Signed Distance Field. Our approach also simplifies the user interface for manual 3D pose alignment in 2D input systems. The results of our work indicate that our approach can reduce the time needed for interactive 3D pose alignment, which improves the user experience.
  • Article
    Citation - WoS: 4
    Citation - Scopus: 4
    The Guided Evaluation Method: An Easier Way to Empirically Estimate Trained User Performance for Unfamiliar Keyboard Layouts
    (Academic Press Ltd- Elsevier Science Ltd, 2024) Mutasim, Aunnoy K.; Batmaz, Anil Ufuk; Mughrabi, Moaaz Hudhud; Stuerzlinger, Wolfgang
    To determine in a user study whether proposed keyboard layouts, such as OPTI, can surpass QWERTY in performance, extended training through longitudinal studies is crucial. However, addressing the challenge of creating trained users presents a logistical bottleneck. A common alternative involves having participants type the same word or phrase repeatedly. We conducted two separate studies to investigate this alternative. The findings reveal that both approaches, repeatedly typing words or phrases, have limitations in accurately estimating trained user performance. Thus, we propose the Guided Evaluation Method (GEM), a novel approach to quickly estimate trained user performance with novices. Our results reveal that in a matter of minutes, participants exhibited performance similar to an existing longitudinal study - OPTI outperforms QWERTY. As it eliminates the need for resource-intensive longitudinal studies, our new GEM thus enables much faster estimation of trained user performance. This outcome will potentially reignite research on better text entry methods.
  • Conference Object
    Citation - WoS: 5
    Citation - Scopus: 10
    EyeGuide & EyeConGuide: Gaze-based Visual Guides to Improve 3D Sketching Systems
    (Assoc Computing Machinery, 2024) Turkmen, Rumeysa; Gelmez, Zeynep Ecem; Batmaz, Anil Ufuk; Stuerzlinger, Wolfgang; Asente, Paul; Sarac, Mine
    Visual guides help to align strokes and raise accuracy in Virtual Reality (VR) sketching tools. Automatic guides that appear at relevant sketching areas are convenient for seamless guided sketching. We explore guides that exploit eye-tracking to render them adaptive to the user's visual attention. EYEGUIDE and EYECONGUIDE cause visual grid fragments to appear spatially close to the user's intended sketches, based on the information of the user's eye-gaze direction and the 3D position of the hand. Here we evaluated the techniques in two user studies across simple and complex sketching objectives in VR. The results show that gaze-based guides have a positive effect on sketching accuracy, perceived usability and preference over manual activation in the tested tasks. Our research contributes to integrating gaze-contingent techniques for assistive guides and presents important insights into multimodal design applications in VR.
  • Conference Object
    Citation - WoS: 3
    Citation - Scopus: 4
    Effect of Hand and Object Visibility in Navigational Tasks Based on Rotational and Translational Movements in Virtual Reality
    (IEEE Computer Soc, 2024) Hatira, Amal; Gelmez, Zeynep Ecem; Batmaz, Anil Ufuk; Sarac, Mine
    During object manipulation in Virtual Reality (VR) systems, realistically visualizing avatars and objects can hinder user performance and experience by complicating the task or distracting the user from the environment due to possible occlusions. Users might feel the urge to go through biomechanical changes, such as re-positioning the head to visualize the interaction area. In this paper, we investigate the effect of hand avatar and object visibility in navigational tasks using a VR headset. We performed two user studies where participants grasped a small, cylindrical object and navigated it through the virtual obstacles performing rotational or translational movements. We used three different visibility conditions for the hand avatar (opaque, transparent, and invisible) and two conditions for the object (opaque and transparent). Our results indicate that participants performed faster and with fewer collisions using the invisible and transparent hands compared to the opaque hand and fewer collisions with the opaque object compared to the transparent one. Furthermore, participants preferred to use the combination of the transparent hand avatar with the opaque object. The findings of this study might be useful to researchers and developers in deciding the visibility/transparency conditions of hand avatars and virtual objects for tasks that require precise navigational activities.
  • Conference Object
    Citation - Scopus: 12
    Exploring Discrete Drawing Guides to Assist Users in Accurate Mid-Air Sketching in VR
    (Association for Computing Machinery, 2022) Türkmen, R.; Pfeuffer, K.; Barrera MacHuca, M.D.; Batmaz, A.U.; Gellersen, H.
    Even though VR design applications that support sketching are popular, sketching accurately in mid-air is challenging for users. In this paper, we explore discrete visual guides that assist users' stroke accuracy and drawing experience inside the virtual environment. We also present an eye-tracking study that compares continuous, discrete, and no guide in a basic drawing task. Our experiment asks participants to draw a circle and a line using three different guide types, three different sizes and two different orientations. Results indicate that discrete guides are more user-friendly than continuous guides, as the majority of participants preferred their use, while we found no difference in speed/accuracy compared to continuous guides. Potentially, this can be attributed to distinct eye-gaze strategies, as discrete guides led users to shift their eyes more frequently between guide points and the drawing cursor. Our insights are useful for practitioners and researchers in 3D sketching, as they are a first step to inform future design applications of how visual guides inside the virtual environment affect visual behaviour and how eye-gaze can become a tool to assist sketching.
  • Conference Object
    Citation - WoS: 7
    Citation - Scopus: 9
    Effects of Opaque, Transparent and Invisible Hand Visualization Styles on Motor Dexterity in a Virtual Reality Based Purdue Pegboard Test
    (IEEE Computer Soc, 2023) Voisard, Laurent; Hatira, Amal; Sarac, Mine; Kersten-Oertel, Marta; Batmaz, Anil Ufuk
    The virtual hand interaction technique is one of the most common interaction techniques used in virtual reality (VR) systems. A VR application can be designed with different hand visualization styles, which might impact motor dexterity. In this paper, we aim to investigate the effects of three different hand visualization styles - transparent, opaque, and invisible - on participants' performance through a VR-based Purdue Pegboard Test (PPT). A total of 24 participants were recruited and instructed to place pegs on the board as quickly and accurately as possible. The results indicated that using the invisible hand visualization significantly increased the number of task repetitions completed compared to the opaque hand visualization. However, no significant difference was observed in participants' preference for the hand visualization styles. These findings suggest that an invisible hand visualization may enhance performance in the VR-based PPT, potentially indicating the advantages of a less obstructive hand visualization style. We hope our results can guide developers, researchers, and practitioners when designing novel virtual hand interaction techniques.
  • Article
    Haptic-Assisted Soldering Training Protocol in Virtual Reality: The Impact of Scaffolded Guidance
    (Institute of Electrical and Electronics Engineers Inc., 2025) Yilmaz, M.; Batmaz, A.U.; Sarac, M.
    In this paper, we present a virtual training platform for soldering based on immersive visual feedback (i.e., a Virtual Reality (VR) headset) and scaffolded guidance (i.e., disappearing throughout the training) provided through a haptic device (Phantom Omni). We conducted a between-subject user study experiment with four conditions (2D monitor with no guidance, VR with no guidance, VR with constant, active guidance, and VR with scaffolded guidance) to evaluate their performance in terms of procedural memory, motor skills in VR, and skill transfer to real life. Our results showed that the scaffolded guidance offers the most effective transition from the virtual training to the real-life task, even though the VR with no guidance group had the best performance during the virtual training. These findings are critical for industry and academia looking for safer and more effective training techniques, leading to better learning outcomes in real-life implementations. Furthermore, this work offers new insights into further haptic research in skill transfer and learning approaches while offering information on the possibilities of haptic-assisted VR training for complex skills, such as welding and medical stitching.
  • Conference Object
    Citation - WoS: 3
    Citation - Scopus: 5
    Effect of Grip Style on Peripersonal Target Pointing in VR Head-Mounted Displays
    (IEEE Computer Soc, 2023) Batmaz, Anil Ufuk; Turkmen, Rumeysa; Sarac, Mine; Machuca, Mayra Donaji Barrera; Stuerzlinger, Wolfgang
    When working in Virtual Reality (VR), the user's performance is affected by how the user holds the input device (e.g., controller), typically using either a precision or a power grip. Previous work examined these grip styles for 3D pointing at targets at different depths in peripersonal space and found that participants had a lower error rate with the precision grip but identified no difference in movement speed, throughput, or interaction with target depth. Yet, this previous experiment was potentially affected by tracking differences between devices. This paper reports an experiment that partially replicates and extends the previous study by evaluating the effect of grip style on the 3D selection of nearby targets with the same device. Furthermore, our experiment re-investigates the effect of the vergence-accommodation conflict (VAC) present in current stereo displays on 3D pointing in peripersonal space. Our results show that grip style significantly affects user performance. We hope that our results are useful for researchers and designers when creating virtual environments.