Evaluating the utility of two gestural discomfort evaluation methods
Minseok Son, Jaemoon Jung, Woojin Park
Department of Industrial Engineering, Seoul National University, Seoul, South Korea
Editor: Manabu Sakakibara, Tokai University, Japan
Evaluating the physical discomfort of designed gestures is important for creating safe and usable gesture-based interaction systems; yet, gestural discomfort evaluation has not been extensively studied in HCI, and few evaluation methods with experimentally confirmed utility are currently available. To address this gap, this study empirically demonstrated the utility of the subjective rating method after a small number of gesture repetitions (a maximum of four) in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. The subjective rating method has been widely used in previous gesture studies, but without empirical evidence of its utility. This study also proposed a gesture discomfort evaluation method based on an existing ergonomics posture evaluation tool, Rapid Upper Limb Assessment (RULA), and demonstrated its utility in evaluating designed gestures in terms of physical discomfort resulting from prolonged, repetitive gesture use. RULA is an ergonomics postural analysis tool that quantifies work-related musculoskeletal disorder risks for manual tasks and was hypothesized to be capable of correctly determining discomfort resulting from prolonged, repetitive gesture use. The two methods were evaluated through comparisons against a baseline method involving discomfort rating after actual prolonged, repetitive gesture use. Correlation analyses indicated that both methods were in good agreement with the baseline. The methods examined in this study appear useful for predicting discomfort resulting from prolonged, repetitive gesture use and are expected to help interaction designers create safe and usable gesture-based interaction systems.
Data Availability Statement: All relevant data are within the paper.
Funding: This work was supported by the Global Frontier R&D Program on <Human-centered Interaction for Coexistence> funded by the National Research Foundation of Korea grant funded by the Korean Government (MEST) (NRF-M1AXA003-2011-0031425). The funder had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.
Introduction
Gestures involving upper-extremity postures or motions (hereafter, simply UE gestures or gestures) are among the most basic means of human communication, along with speech and facial expressions [1]. People use gestures to complement and supplement speech. Gestures can also stand on their own as substitutes for speech [2,3]; indeed, deaf communities use sign languages composed of gestures for everyday communication.
As gesture recognition technologies mature and become more widely available, designed UE gestures are increasingly utilized for human-machine interaction (HMI) [4–6]. Current applications of gesture-based interaction include machine/device control [7,8], data exploration in virtual reality [9], artistic creation [10], musical instrument playing [11], and games [12–15]. Gestures are also being utilized to invoke functions and commands during routine computer tasks, such as word processing [16] and web browsing [17–20], partly replacing the traditional keyboard-and-mouse input system. When working with many of these applications, human users may need to perform gestures repetitively over a long duration.
Prolonged, repetitive use of UE gestures can give rise to physical discomfort, especially when the designed gestures are awkward and stressful. Excessive physical discomfort can adversely affect work productivity [21] and deteriorate the user experience. It is also known to be associated with increased risks of musculoskeletal disorders [22–27]. Multiple previous studies have reported that sign language interpreters, who use UE gestures extensively, experience musculoskeletal discomfort and pain and are at an increased risk of upper-extremity musculoskeletal disorders [28–31]. The design of gestures for HMI must therefore control the physical discomfort associated with gesture use by identifying and excluding stressful design alternatives. Such discomfort control is a necessary condition for the wide acceptance of gesture-based interaction.
The ability to adequately evaluate the discomfort levels of different gesture design alternatives is a prerequisite to achieving physical discomfort control in gesture design. In particular, a means of characterizing and classifying gestures in terms of the level of discomfort resulting from prolonged and repetitive gesture use, which is the operating condition of interest, is needed (...truncated)