Subtask-Based Virtual Hand Visualization Method for Enhanced User Accuracy in Virtual Reality Environments


Date

2024

Publisher

IEEE Computer Society

Green Open Access

No

Publicly Funded

No

Impulse: Average
Influence: Average
Popularity: Average

Abstract

In virtual hand interaction techniques, an opaque virtual hand avatar can occlude the user's view and obstruct visual feedback, degrading accuracy and increasing cognitive load. Because cognitive load is related to gaze behavior, our study analyzes participants' gaze movements across opaque, transparent, and invisible hand visualizations in order to design a new interaction technique. For the experimental setup, we used a Purdue Pegboard Test comprising reaching, grasping, transporting, and inserting subtasks. We examined how long and where participants focused their gaze during these subtasks and, based on the findings, introduced a new subtask-based virtual hand visualization method to increase accuracy. We hope our results can inform future virtual reality applications in which users must interact with virtual objects accurately.
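
As we read the abstract, the core idea is to adapt the hand avatar's visibility to the current subtask of the pegboard task. The sketch below is a minimal, hypothetical Python illustration of that idea only: the Subtask enum, the choose_hand_alpha helper, and the specific alpha values are placeholder assumptions for illustration, not the visualization settings or implementation reported in the paper.

```python
# Minimal sketch (not the authors' implementation): map each Purdue Pegboard
# subtask to a hand-avatar opacity. The subtask names follow the paper's four
# phases; the alpha values and the Subtask/choose_hand_alpha names are
# hypothetical placeholders.
from enum import Enum, auto


class Subtask(Enum):
    REACH = auto()      # moving the hand toward the peg
    GRASP = auto()      # closing the fingers around the peg
    TRANSPORT = auto()  # carrying the peg toward the hole
    INSERT = auto()     # fine alignment and insertion


# Hypothetical per-subtask opacities: fully visible while reaching and grasping,
# more transparent when the hand would otherwise occlude the target.
HAND_ALPHA = {
    Subtask.REACH: 1.0,
    Subtask.GRASP: 1.0,
    Subtask.TRANSPORT: 0.5,
    Subtask.INSERT: 0.0,
}


def choose_hand_alpha(subtask: Subtask) -> float:
    """Return the hand-avatar alpha (0 = invisible, 1 = opaque) for a subtask."""
    return HAND_ALPHA[subtask]


if __name__ == "__main__":
    for phase in Subtask:
        print(f"{phase.name.lower():>9}: alpha = {choose_hand_alpha(phase)}")
```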

Description

Gemici, Mucahit/0009-0004-4655-4743; Hatira, Amal/0009-0006-6452-0672; Bashar, Mohammad Raihanul/0000-0002-5271-457X

Keywords

Human-centered computing, Visualization, Visualization techniques, Visualization design and evaluation methods

OpenCitations Citation Count: 1

Source

IEEE Conference on Virtual Reality and 3D User Interfaces (VR), March 16-21, 2024, Orlando, FL

Start Page

6

End Page

11

PlumX Metrics

Citations

Scopus: 3

Captures

Mendeley Readers: 2

OpenAlex FWCI: 2.33241011
