Detecting & interpreting self-manipulating hand movements for student’s affect prediction
Akhtar Hussain¹, Abdul Rehman Abbasi², Nitin Afzulpurkar¹

¹ Department of Computer Science, Asian Institute of Technology, Bangkok, Thailand
² Design Engineering Laboratory, KINPOE, Karachi, Pakistan
Background: In this paper, we report on the development of a non-intrusive system that predicts a student's mental state from his or her unintentional hand-touch-head (face) movements. Methods: A hand-touch-head (face) movement is a typical case in which otherwise easily detectable image features are occluded by similar skin color and texture; we address it with the proposed Sobel-operated local binary pattern (SLBP) method using force field features. We code six different gestures from more than 100 human subjects and use these codes as manual input to a three-layered Bayesian network (BN). The first layer holds mental-state-to-gesture relationships obtained in an earlier study, while the second layer embeds gestures and SLBP-generated binary codes. Results: The method proves very successful in separating the hand(s) from the face region under varying illumination conditions. Evaluated on a novel data set, the proposed scheme is promising, with an accuracy of about 85%. Conclusion: The framework will be used for developing an intelligent tutoring system.
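The SLBP pipeline named in the abstract (Sobel gradient filtering followed by local binary pattern coding) can be sketched roughly as follows. This is a minimal illustrative sketch in pure NumPy under our own assumptions; the function names, kernel choices, and thresholding convention are ours, not the authors' implementation, and the force field features and BN layers are omitted.

```python
import numpy as np

def sobel_magnitude(img):
    """Approximate gradient magnitude with 3x3 Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros((h, w))
    gy = np.zeros((h, w))
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    return np.hypot(gx, gy)

def lbp_codes(img):
    """8-neighbour local binary pattern: threshold each neighbour
    against the centre pixel and pack the bits into one byte."""
    h, w = img.shape
    codes = np.zeros((h, w), dtype=np.uint8)
    # clockwise neighbour offsets starting at the top-left pixel
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            centre = img[i, j]
            code = 0
            for bit, (di, dj) in enumerate(offsets):
                if img[i + di, j + dj] >= centre:
                    code |= 1 << bit
            codes[i, j] = code
    return codes

# Apply LBP to the Sobel gradient image rather than the raw intensities,
# which is the Sobel-operated variant the paper's name suggests.
def slbp(img):
    return lbp_codes(sobel_magnitude(img))
```

In this sketch, running LBP on the gradient image rather than on raw intensities makes the codes respond to edge structure, which is plausibly what helps separate a hand boundary from the similarly colored face region.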
build better relationships in the community, which is one of the successful aspects of human life [3].
Lately, researchers from multiple disciplines have sought to incorporate a similar kind of intelligence and care into modern computing systems. This may benefit a number of real-world applications, e.g., patient mental health care, lie detection, and affective tutoring systems [4,7]. The research work to date on recognizing subjects' affective (mental) states is predominantly related to facial expression analysis.
Furthermore, such work is mostly limited to recognizing basic or prototypic emotional categories [8], which are rare in spontaneous, real-life situations.
A number of modalities and expressions could be used for affect recognition. Bodily expressions (other than facial ones), especially hand gestures (both intentional and unintentional), are difficult to examine for spontaneous emotional analysis, though they are considered important cues in conveying users' intentions or affect [9,11]. An apparent reason for this is the error-prone, expensive, and very time-consuming process of manually labeling spontaneous emotional expressions [12].
Many prototypes have been proposed to develop gesture-to-affect relationship theories (still a relatively unexplored area in psychology [13]); however, to the best of our knowledge, most of these efforts evaluate affect objectively without considering the context or situation in which the subject experiences it.
More recently, [14] reports an analysis of a small but novel data set describing situation-specific gesture-to-mental-state relationships. They observe that hand gestures (reportedly unintentional ones), i.e., Chin Rest, Head Scratch, Ear Scratch, Hands on Cheek, Eye Rub, and Nose Itch, probabilistically represent students' affective (mental) states in classroom settings. They report on obtaining self-reported affective (mental) states, namely Thinking, Recalling, Concentrating, Tired, Relaxed, and Satisfied, and envisage using these relationships to develop an affective tutoring application.
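The probabilistic gesture-to-state relationships described above can be turned into inference by Bayes' rule: given an observed gesture, the posterior over mental states is proportional to the prior times the likelihood of the gesture under each state. The sketch below illustrates this with hypothetical numbers of our own; the actual probabilities reported in [14] are not reproduced here.

```python
# Hypothetical priors and likelihoods purely for illustration;
# they are NOT the values reported in [14].
p_state = {"Thinking": 0.4, "Tired": 0.3, "Relaxed": 0.3}  # priors P(S)
p_chinrest_given_state = {                                 # P(G=ChinRest | S)
    "Thinking": 0.6, "Tired": 0.3, "Relaxed": 0.1,
}

def posterior(priors, likelihoods):
    """P(S | G) by Bayes' rule, normalising over all states."""
    joint = {s: priors[s] * likelihoods[s] for s in priors}
    z = sum(joint.values())  # evidence P(G)
    return {s: v / z for s, v in joint.items()}

post = posterior(p_state, p_chinrest_given_state)
```

With these illustrative numbers, observing a Chin Rest shifts the belief toward Thinking, since that state both has the highest prior and best explains the gesture.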
Much earlier, [15] proposed and evaluated a student behavior model using non-verbal cues. [5] also proposes an intelligent tutoring system for children that observes how their gestures correlate with learning skills. [16] proposes a multimodal approach, i.e., using conversational cues, body posture, and facial features, to determine when learners are confused, bored, or frustrated during tutoring sessions with an affect-sensitive intelligent tutor. [7] explores the relationship between students' affective states and engagement levels during learning with an expert tutor. Similarly, [17] attempts to identify students' behavior from physical movement during learning. We notice, however, that the movements characterized by [14] as carrying affective information involve simple yet difficult-to-track hand-touch-head (face) movements. In fact, when the face region is occluded by the hand(s), which have the same skin color and texture, machine-vision-based detection schemes face a great challenge.
Attempts to address this challenge are quite promising but far from the state of the art [18]. Local binary patterns (LBPs) and Gabor filtering methods are also used for face detection, especially for texture analysis in the image. In fact, many earlier systems have treated these occlusions as noise, but more recently, [19] considers them helpful cues when used in conjunction with facial expressions for real-time emotion recognition. They report that LBP performs better than Gabor f (...truncated)