Computational modeling of mood (CMoM) based on neurophysiological stimuli of affect


Bibliographic Details
Main Author: Handayani, Dini Oktarina Dwi (Author)
Format: Thesis
Language:English
Published: Kuala Lumpur : Kulliyyah of Information and Communication Technology, International Islamic University Malaysia, 2017
Subjects:
Online Access:http://studentrepo.iium.edu.my/handle/123456789/5656
Description
Summary:Recent studies have shown the potential of computers to understand, analyze and respond to human emotions through speech, facial expressions and neurophysiological interfaces. Although emotion plays an important role in human communication, health and brain development, it provides only immediate responses contingent on the situation or environment. Research has shown that mood appears to be responsible for long-term disorders such as stress, anxiety and depression. To date, little effort has been made to enable computer software to understand and analyze mood. Thus, in this study, a computational model of mood (CMoM) was proposed to recognize mood from sequences of short-term emotional states, namely emotions. The input for the CMoM was derived in the form of electroencephalogram (EEG) signals. Since depression, anxiety and stress can affect decision-making processes, this research also analyzes human mood disorder in relation to the CMoM. The Depression Anxiety Stress Scales (DASS-21) were used to validate the findings. The basic emotions of happiness, fear, sadness and calmness from the International Affective Picture System (IAPS) were used as the basis emotion functions for deriving the valence and arousal of the affective space model (ASM), which in turn was used to generate the CMoM. EEG signals were captured and recorded from every subject while they performed the given tasks. Features were extracted using Mel-Frequency Cepstral Coefficients (MFCC) and classified using a Multilayer Perceptron (MLP). Results of the analysis show that using the CMoM to study the relationship between emotional state and mood is plausible. In addition, emotional videos were compiled from a collection of emotional video databases and used in the experiments to analyze and understand the effect of watching such videos on human mood and perception.
All these videos were rated and compared using both the CMoM and the Self-Assessment Manikin (SAM). These emotional videos will be made available for use by researchers working in the field of affective computing. This research also found that watching emotional videos can trigger and alter individual moods, ultimately leading to mood swings. In summary, the CMoM can visualize and analyze human emotion and mood accurately, and its results were shown to correlate with the DASS-21 psychometric test for subjects' initial moods. This research provides insight into how human emotional states change from one emotion to another and how such changes can alter an individual's mood.
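The abstract's signal-processing pipeline (MFCC-style features extracted from EEG epochs, then classified with an MLP) can be illustrated with a minimal sketch. Note that this is not the thesis's actual configuration: the sampling rate, filterbank parameters, and synthetic two-class "EEG" signals below are assumptions for illustration only, using NumPy and scikit-learn.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

def mfcc_like_features(signal, fs=128, n_fft=256, n_mels=12, n_ceps=8):
    """Simplified MFCC-style features: power spectrum -> mel filterbank
    -> log -> DCT-II. Parameters are illustrative, not from the thesis."""
    spectrum = np.abs(np.fft.rfft(signal, n=n_fft)) ** 2
    freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)
    # Triangular filters with mel-spaced center frequencies over 0..fs/2
    mel_max = 2595.0 * np.log10(1.0 + (fs / 2) / 700.0)
    mel_pts = np.linspace(0.0, mel_max, n_mels + 2)
    hz_pts = 700.0 * (10 ** (mel_pts / 2595.0) - 1.0)
    fb = np.zeros((n_mels, len(freqs)))
    for m in range(1, n_mels + 1):
        left, center, right = hz_pts[m - 1], hz_pts[m], hz_pts[m + 1]
        up = (freqs - left) / (center - left)
        down = (right - freqs) / (right - center)
        fb[m - 1] = np.clip(np.minimum(up, down), 0.0, None)
    log_mel = np.log(fb @ spectrum + 1e-10)
    # DCT-II decorrelates the log filterbank energies (the "cepstral" step)
    n = np.arange(n_mels)
    dct = np.cos(np.pi * np.outer(np.arange(n_ceps), (2 * n + 1) / (2 * n_mels)))
    return dct @ log_mel

# Toy data: two synthetic "EEG" classes with different dominant rhythms
rng = np.random.default_rng(0)
fs = 128
t = np.arange(256) / fs

def make_epochs(freq, n):
    # n epochs of a noisy sinusoid at `freq` Hz (stand-in for real EEG)
    return [np.sin(2 * np.pi * freq * t) + 0.3 * rng.standard_normal(t.size)
            for _ in range(n)]

X = np.array([mfcc_like_features(s, fs) for s in make_epochs(6, 40) + make_epochs(20, 40)])
y = np.array([0] * 40 + [1] * 40)

# A small MLP, as in the abstract's feature-classification stage
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

Because the two synthetic classes have well-separated dominant frequencies, the mel-filterbank features make them easy for even a small MLP to distinguish; real EEG epochs would of course be far noisier and require held-out evaluation.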
Item Description:Abstracts in English and Arabic.
"A thesis submitted in fulfilment of the requirement for the degree of Doctor of Philosophy in Computer Science." --On title page.
Physical Description:xix, 259 leaves : ill. ; 30cm.
Bibliography:Includes bibliographical references (leaves 187-204).